Show archive of December 2009
December 30, 2009
This year we saw a lot of changes take place in the search space, and as Internet-accessing mobile devices became more widespread, the question of public access became a more important consideration. It turns out that public access to the Internet was an important thread, interwoven in many developments throughout 2009. Within the European Union, new rules have been passed that would guarantee high-speed Internet connections for all citizens. In the U.S., a debate is under way about public access to federally funded research, and both net neutrality and anti-net neutrality legislation are being considered by the U.S. government. The idea of Internet access as a right is gaining attention.
Jim Hedger, WebmasterRadio blogger and host of the show Webcology, then joins Virginia to look at the year in search. Mobile’s big break, emerging search deals, and new features at Google all make the cut of top topics of 2009. Jim also shares some insider info about his favorite WebmasterRadio moments of 2009, as well as what to look forward to from WebmasterRadio in the new year, including video integration and new shows in the line-up.
Wrapping up the last show of the year, Susan, Virginia and Michael Terry look back on what they consider the biggest milestones of 2009. Susan felt the effect of Twitter as a budding means of brand dialogue. Virginia was glad to see mainstream use of mobile devices to access the Internet finally reach maturity. And Michael noted that real-time search results signal a change in how people gather and process information.
December 23, 2009
At the beginning of December, personalized search was rolled out to all Google users, not just those signed in to their Google accounts. Now Web history is gleaned through an anonymous cookie, and the search activity of non-signed-in users is stored for 180 days, thereby affecting search results. Bruce Clay, Susan Esparza and Virginia Nussey consider how personalized search results affect the job of an SEO and shape users’ expectations of search results. Bruce gives his recommendations for search engine optimization in a world of personalization and intent-based targeting.
David Harry, blogger and founder of the community forum and discussion hub, SEO Dojo, has performed three rounds of research to gather data on result ranking flux due to personalization. Dave shares his analysis of the latest round of testing, and among his observations he notes the difference of results for informational and transactional searches. He also explains how SEOs might go about optimization in light of personalization. Finally, Dave puts the personalized search piece of the puzzle into a broader point of view, including the way a more powerful infrastructure like Caffeine allows broad customization of search results to occur.
Then Susan, Virginia and Michael Terry look at another recent Google implementation — real-time search results. The question of spam and low-quality information is raised, as is the shift in information gathering and consumption. Some seasoned search marketers expect real-time search results to disappear from Google in the coming months, while others see real-time search as a valuable tool for mining data online.
December 16, 2009
Pay per click advertising is the star of today’s podcast, with Bruce Clay starting off the show reviewing recent PPC news and updates. Product Listing Ads, which provide richer product information and work on a cost-per-action basis, are now available to select advertisers in beta. Google has suspended its Local Listing Ads service as it analyzes feedback to improve the product in advance of a full launch. The search engine has updated and expanded the Google Advertising Professionals certification program. And the AdWords campaign management tools have been enhanced with new segmentation options and integrated performance data.
Bruce Clay, Inc. PPC analyst Jim Stratton then joins Virginia to talk about his recommendations for using broad match in AdWords campaigns. Rather than avoiding the option altogether, Jim advises constantly reviewing search query reports to identify negative keywords, which help advertisers avoid paying for low-quality traffic. He also recommends tailoring ad copy to pre-qualify the search audience. Broad match also holds value as a tool for keyword discovery. Finally, they consider the effect of broad match on a campaign’s Quality Score.
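As a rough illustration of the query-report review Jim describes (this is a hypothetical sketch, not Google's API or Jim's exact process), one simple heuristic is to flag queries that accumulated clicks but never converted as candidates for the negative keyword list:

```python
def negative_keyword_candidates(query_report, min_clicks=20):
    """Flag search queries that drew clicks but no conversions.

    query_report: list of dicts with 'query', 'clicks' and 'conversions'
    keys, e.g. rows exported from an AdWords search query report.
    min_clicks filters out queries with too little data to judge.
    """
    return [
        row["query"]
        for row in query_report
        if row["clicks"] >= min_clicks and row["conversions"] == 0
    ]

# Hypothetical report rows for a shoe retailer's broad match campaign
report = [
    {"query": "buy running shoes", "clicks": 120, "conversions": 8},
    {"query": "free shoes", "clicks": 45, "conversions": 0},
    {"query": "shoe repair tips", "clicks": 30, "conversions": 0},
    {"query": "trail shoes sale", "clicks": 10, "conversions": 0},
]
print(negative_keyword_candidates(report))
# ['free shoes', 'shoe repair tips']
```

Queries like "free" or informational phrases that never convert would then be added as negative keywords, so the broad match term stops paying for that traffic while still catching new, relevant variations.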
Finally, Jim and Virginia look at an often-overlooked metric of pay per click advertising, revenue per click. Revenue per click can be calculated through a simple formula, or e-commerce companies can link their AdWords and Google Analytics accounts to have the revenue per click provided to them by Google. Along with revenue per click, Jim advises that PPC marketers also look at lifetime customer value, as RPC doesn’t account for LCV, another hidden metric of pay per click.
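The simple formula mentioned above is just revenue attributed to an ad or keyword divided by the clicks it received. A minimal sketch (the function name and sample figures are illustrative, not from the show):

```python
def revenue_per_click(total_revenue, total_clicks):
    """Revenue per click (RPC): revenue attributed to an ad or
    keyword divided by the number of clicks it received."""
    if total_clicks == 0:
        return 0.0
    return total_revenue / total_clicks

# Example: $1,250 in tracked sales from 500 clicks
print(revenue_per_click(1250.00, 500))  # 2.5, i.e. $2.50 per click
```

Comparing RPC against cost per click shows at a glance whether a keyword is paying for itself, which is why it pairs naturally with the lifetime-customer-value discussion.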
December 9, 2009
Bruce Clay has long shared his passion for search engine optimization tools through the SEOToolSet. On the podcast Bruce gives an insider’s view on the company’s tools as they work hand in hand with Bruce’s SEO methodology. He also previews the new and improved SEO tools scheduled for release next year. With the SEOToolSet, users can simplify the process of site and competitor diagnostics.
Then Aaron Landerkin, Bruce Clay, Inc.’s IT Manager directing the development of the new SEOToolSet, demos the free SEO tools available at SEOTools.com. The recently launched site compiles the company’s freely available SEO tools, including the Domain Report Tool, the Server Response Checker, the SEO Cloaking Checker, the KSP Tool and the SEMToolBar. Aside from the SEMToolBar, the tools on this site are slightly limited versions of their subscription-based equivalents in the SEOToolSet.
A complement to the SEOToolSet is the SEO training that instructs attendees on how to use the tools for best effect in partnership with Bruce’s methodology and SEO strategies. Standard SEO training courses are offered almost every month in Simi Valley, California, and the advanced course is offered almost every other month in California. Plus, the SEO training course is offered at many industry conferences, in New York and around the world. Bruce, Susan and Virginia consider some of their favorite qualities and contents of the training course.
December 2, 2009
Ask.com is pursuing the real-time and Q&A segments of search with a new strategy and contagious excitement. In a post titled The Next Frontier in Search: Questions & Answers, Ask.com U.S. president Doug Leeds announced that the company was developing technology to better extract existing answers on the Web as well as to better find and index the sources of answers not yet published. The latter will be achieved by identifying human subject matter experts who can be called upon to answer questions as they arise. Bruce Clay expresses concern over how Ask.com identifies an expert. What qualifications must experts meet? How can webmasters and SEOs optimize their odds of being considered experts?
Doug Leeds then joins the program to answer those very questions and to explore the need for evolving technology in the question and answer space. Doug looks at the shortcomings of search in delivering answers and the different ways people approach search when looking for an answer versus when researching a topic. He explains what people can do to prepare their sites for Q&A search and for being considered subject matter experts by Ask.com. Real-time question-and-answer capability, driven by human editorial input and participation, is part of the strategy Ask.com is taking to evolve its search results.
While Ask.com will seek to extract answers from Web pages as they are today, Susan Esparza, Michael Terry and Virginia Nussey consider whether or not microformats could help the process. Microformats are conventions used to indicate common types of content on a Web page, such as info on events, companies, products or reviews. If a standard could be agreed upon for Q&A pairs, would search quality be improved? They consider the solution in theory but are aware that the standardization and technical requirements of new microformats aren’t a quick or nimble fix.