Tuesday, December 22, 2009

Firefox 3.5 passes IE7 as most widely used browser

Firefox 3.5 has recently overtaken Internet Explorer 7 as the most widely used browser according to StatCounter. The belief is that more people are switching to Firefox because of its added speed and better adherence to widely accepted web standards.

What does this mean for website patrons? As browsers that poorly adhere to web standards, such as Internet Explorer, decline in use, developers are able to utilize more advanced techniques (such as CSS3). These techniques empower developers to improve the end user experience by making sites more interactive and responsive. It also ensures that web pages will look more uniform across a wider range of browsers without having to use special Internet Explorer "hacks". By minimizing the time developers spend on special-case formatting and tweaking, more time can be spent on optimization and other techniques that make sites faster and easier to use.

First Scribe continues to monitor technology trends in order to provide the best possible end user experience on sites we design. Contact us today for more information.

Thursday, December 3, 2009

When Large Businesses Game the System

There has been a fair amount of news recently about AOL's goal to re-brand its image from one that speaks to poor user experience, mediocre content, and a mailbox full of SPAM to... well I'm not exactly sure what they're transitioning to at the moment. If one were to listen to what current AOL CEO and former Associated Content co-founder Tim Armstrong has to say on the subject, it doesn't sound as if much has changed.

Over the past two weeks, details about AOL's new business model have been slowly revealed to the tech press, and it looks as if Armstrong is betting on rebuilding AOL as a closed provider of in-network link SPAM and junk content.

It's not the freshest or most interesting idea, but it's simple enough to understand: use basic algorithms and automated systems to track the highest-queried terms, and then assign these terms to contract writers to produce topical content to drive traffic. The theory goes that if you can produce content that covers hot topics and then artificially increase article relevancy and visibility by utilizing inbound links from your own micro-sites, your new content will be propelled to the top of search engine rankings and your traffic will increase with it. It's a content strategy that's three shades off from being completely honest, and it's one that spammers and malicious sites have been using for years in order to increase visits. It's also the exact model used by demand-media companies like the aforementioned Associated Content to saturate the web with some of the dullest and most bounce-worthy information imaginable.

Some have already deemed AOL's new strategy a long-term failure by simply assuming that the search engines will eventually massage out the inherent inadequacies that allow this sort of content to win out in the first place. But the truth of the matter is that junk content is still content, and as long as the content on a page is coherent and readable, it still has a chance of winning out over far more interesting entries. Answers.com, eHow, and About.com are just three examples of sites that employ strategies similar to the one AOL is attempting (albeit in a slightly better fashion), and they have yet to be penalized by search engines in any significant way.

This is actually going to be interesting from a search engine marketing standpoint; not only because it allows us to go back over and evaluate our own methodology, but also because AOL's high-profile approach to content creation will give us a window into what the search engines will or won't tolerate over the course of time. There's no doubt that AOL's venture into this area is due to a lack of policing by the search engines. In fact, the entire business model is reliant on the fact that search engines won't make any sweeping or sudden changes to their algorithms. The question now, though, is whether or not this marketing strategy is capable of long-term sustainability.

Tuesday, November 10, 2009

What is the future of search?

Search appears to be developing a split personality, depending on the type of search being performed. At the moment we use a single tool (of the visitor’s choice) for every kind of data search. That will become more and more cumbersome as the data set available on the Internet grows.

1. Educational data – A significant amount of search is done in a “stream of thought” for educational purposes. I apply “education” very broadly here: it could be a student researching Taiwan or a consumer doing research for a future purchase. Either way, this is neither life nor death, nor is there an imminent purchase/decision to be made. The visitor is learning for future reference.

In this case the volume of data available on the Internet is an overwhelming burden. The searcher becomes bogged down in similarly-presented data with no guidance towards authority. The searcher is literally charged with finding data as well as discerning truth/authority.

I believe enhanced (matrix) search tools will begin to help this searcher find their way. Search tools will evolve additional relevancy based not only upon standards-driven user input but also upon statistical analysis of more finite pieces of data. We will see search results point to subsets of data rather than the whole. No longer will the search be based upon an entire work so much as a piece of that work, validated by user input and by further statistical analysis within the search tool itself.

2. Consumable data – In this case I’m speaking of a visitor searching for data that will help them complete a timely (imminent) task such as location of a person, place, or thing. My expectation is that we will see a focus into highly-localized search (based upon known location) combined with some level of augmented reality tools.

The technology to ascertain the visitor’s location is already a simple matter. Resolving that to the data pertinent to that known location is only a matter of time. At the moment this data/location relationship is reliant on business and user data to be manually input and verified. It is only a matter of time before the processing and storage resources are applied to an intensive attack on this problem.

Once data is solidly tied to location, then some use of augmented reality will begin to pay off. A visitor will continue to search for a “keyword/thing” in a location and find a point of purchase for that item in a nearby store, but computers will take it one step further.

I see visitors pointing their phone’s camera lens at an item and clicking a button. The image will be captured and combined with geolocation and proximity data. Then the search engine will reply to the visitor with a refining question - “Do you want to know about that thing? Do you wish to purchase it?”

Message Order Importance

Typically when building pages we see clients put together a list of links or information and put their most important information at the top, and the least important information at the bottom. This is a common practice for many sites, but Primacy and Recency Effects may cause a need to re-think this strategy.

A Primacy Effect is the tendency for items shown earlier in a message to have a higher chance of being remembered by a receiver.

A Recency Effect is the tendency for items at the end of a message to have a higher chance of being remembered by a receiver.

Both Primacy and Recency Effects come into play, depending on a message receiver's involvement with the message. Highly involved receivers (those who are actively reading your content) tend to put more weight on items earlier in the message. Low-involvement receivers (those who are not processing the information in an active manner) will put more weight on items at the end of a message.

A quick overview of "Primacy and Recency Effects on Clicking Behavior" from the Journal of Computer-Mediated Communication:
Testing random-order lists, where every item in the list is shown in each position over time, the most-clicked item was the first in the list, while the second-most-clicked item was the last. This suggests that while some viewers are centrally processing the message, which leads to the Primacy Effect, there are slightly fewer people processing peripherally, which leads to the Recency Effect.
This would suggest that when creating content and choosing the order of a message/links/list, your most important information should be listed first - but your least important should not be last. The last part of the message should be something that holds importance to receivers and can influence them in a positive manner.

An example of this would be to perhaps not only put your main navigation across the top of your web site, but also to repeat that navigation at the bottom of the page to influence peripheral receivers to navigate the site.


Thursday, October 22, 2009

Google and Bing Fight for Content

It’s becoming clearer by the day that Google sees Bing’s presence as a threat to their service.

Because of Microsoft’s investment in improving Bing’s search algorithm and speed, the technology gap that once separated Google from its largest competitor has been narrowed significantly. Whereas years ago improvements to scalability, client-side usability, and relevancy of contextual search results helped push Google to the front of the pack, this noticeable technological disparity has slowly evaporated over time. Given Bing’s relative success when it comes to shopping and travel searches, as well as its recent deal with Twitter and Facebook to improve search results for these services (Google struck a similar deal just two hours later), it looks as if both companies are attempting to aggressively expand their feature-sets and the scope of content being queried.

The most recent development is Google’s partnership with streaming audio service Lala and the MySpace-owned iLike. The deal promises to bring streaming music to the Google page, allowing users to listen to a song once through for free or to pay for repeated listens (10 cents for an online-only version, and $1 for a downloadable MP3). The technical details are still a bit sketchy, but if Google manages to integrate this into their search in a substantial way (audio search, store search, improved MySpace search and caching, etc.), then they’ll have set the bar higher for content search and potentially made up for shortcomings in other areas, like their previous attempts at integrating search with online merchants.

Google clearly recognizes the level of competition that Microsoft brings to the table with Bing, and as time goes on it will be interesting to see where these two tech industry giants focus their efforts.

Tuesday, October 13, 2009

Is Google Indexing Flash Frames?

We have been running a test of Flash content inside Google for quite a while. Google started indexing text inside Flash movies back in 2006. We never proved that Google was actually following ActionScript links, so we decided to put up the Flash test again and see if we pull different results.

We intentionally took the file down for 5 months to clear the cache out of Google. Now we've posted it again, with only a change to the date in the first file.

But are they indexing animated Flash Frames?

We have reason to believe that Google is actually indexing each frame in Flash. Our Flash developers are testing a Flash animation with 12,000 frames and a keyword at the end. It appears that Google has been banging away at the file with a vengeance but we don't see the keyword in search results pages to date.

We won't point to the test file in the blog until we can prove our hypothesis with certainty.

We should see results in the next few weeks, if our last Flash-in-Google test is any hint of the outcome.

Tuesday, October 6, 2009

Optimizing your HTML code

I'm revisiting a few important topics from the web design realm this week. There are a few topics that keep rearing their heads so I can't help but react to the questions.

My previous post regarding design for the iPhone continues to be a hot topic and now we're off to optimizing your HTML Code.

I wrote a post about optimizing your HTML code back in May of 2007, and over 2 years later this continues to be a primary factor in every web design we create. As far as we are concerned, clean HTML code is of paramount importance for a few very good reasons, not the least of which is goodwill towards your visitors!

Obvious reasons to optimize your HTML:
  1. About 9% of adults are still using dial-up for their Internet connections;
  2. Google is a busy search engine;
  3. Bing is a busy search engine;
  4. W3C Validators don't like old code;
  5. The online visitor represents a fickle, impatient audience.
You simply must present a fast-loading, correctly coded website to each visitor. A large percentage of page visits - around 35-45% of them - will last less than 15 seconds per page.

Optimizing your HTML code:

The key to any optimization is to take a phased approach to your work. The idea is to remove as much HTML code as possible, leaving a high ratio of content to code.
  1. Centralize all formatting in linked style sheets.
  2. Centralize all javascript in linked files.
  3. Remove tables and use <div> layers for positioning.
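To make the content-to-code idea concrete, here is a small sketch (the heuristic and the markup samples are our own, purely for illustration - not a real auditing tool) that estimates how much of a page is actual text versus tags:

```javascript
// Crude content-to-code ratio: visible text length vs. total markup length.
// Illustrative heuristic only - strips scripts, styles, and tags, then compares sizes.
function contentToCodeRatio(html) {
  var text = html
    .replace(/<script[\s\S]*?<\/script>/gi, '') // drop inline scripts
    .replace(/<style[\s\S]*?<\/style>/gi, '')   // drop inline styles
    .replace(/<[^>]+>/g, '')                    // drop remaining tags
    .replace(/\s+/g, ' ')                       // collapse whitespace
    .trim();
  return text.length / html.length;
}

// The same word wrapped in table-and-font soup vs. a lean paragraph:
var bloated = '<table><tr><td><font size="2">Hello</font></td></tr></table>';
var lean = '<p>Hello</p>';
```

Running the function on the two samples shows why the three steps above matter: the lean version delivers a far higher share of content per byte served.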

This process will take some time, but the effort will pay off in spades. HTML code errors will drop to zero and your pages will load lightning-quick.

Thursday, October 1, 2009

Website Designs for iPhone Still a Question

Our First Scribe technology team uses our blog to talk about the latest trends in our industry. We try to point out the technology that has an impact on web design, development and SEO topics (or will in the near future).

That being said, this post is in regards to an old topic --

Websites for the iPhone

Back in June of 2007 we posted about the iPhone's upcoming release and the concern that client websites were not iPhone compatible. Of course, most pieces of a website are compatible, save for Flash and Java. But the question is, "Will your website render as well as it could?"

By the way, if you think the iPhone is passé, remember that it only launched 2 years ago!

Here we are 2 years later and we are still talking to people about their corporate website's appearance in an iPhone browser. One of the most common concerns raised during our design sales cycle is the impact of a mobile version of the site (or just the stylesheets).

Lately we discussed the topic in-house. After 2 years, don't you think this topic would go away?

Design concerns for the iPhone are still highly relevant

That old blog post happened to be one of the first iPhone/website pages to hit the market, so it ranks well in Google searches on the topic. That old page still produces a significant amount of traffic for us.

Why? Because it's still relevant.

In fact, a visitor from Apple.com landed on our website today via a search for "build a website iPhone compatible."





If they care (or one person there cares) and we see 2 years of traffic on this same topic - then it's still relevant.

So what do we do?

Develop your site correctly. Build a website with the content separate from the formatting. In doing so, a developer can create a set of stylesheets specific to the browser technology. You are essentially building a website optimized for the iPhone. The content is the same, but the formatting is served dynamically for the mobile visitor.


If a visitor arrives at your website with an iPhone, the website will send them a stylesheet with imagery optimized for their use.
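One common way to do that dynamic serving, sketched below (the stylesheet path and the exact user-agent check are illustrative assumptions, not our production code), is to detect Mobile Safari and attach a device-specific stylesheet:

```javascript
// Detect an iPhone/iPod touch visitor from the browser's user-agent string.
function isIphone(userAgent) {
  return /iPhone|iPod/.test(userAgent || '');
}

// In the browser, the check could then attach a mobile stylesheet:
// if (isIphone(navigator.userAgent)) {
//   var link = document.createElement('link');
//   link.rel = 'stylesheet';
//   link.href = '/css/iphone.css'; // hypothetical mobile stylesheet
//   document.getElementsByTagName('head')[0].appendChild(link);
// }
```

A script-free alternative is the media attribute on a link tag, e.g. media="only screen and (max-device-width: 480px)", which Mobile Safari honors.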


One additional item - be nice to your mobile visitors. Make your pages short to alleviate excessive amounts of scrolling.

Monday, September 28, 2009

Twitter Boasts... And Is Somehow Worth $1 Billion?

The latest factoids from Twitter staff:

1. The average Twitter user has 126 followers;
2. 20% of its traffic comes through the Twitter website;
3. It doesn't make any money.

Near as I can tell, this is the greatest business plan in history. They overstate follower numbers (#1), they over-imply the promise of advertising value (#2), and they can't come up with a business plan to produce a revenue stream (#3).

All that adds up to a valuation, by an outside source, of $1 Billion. Anyone here remember the dot-com bust of 2000?

Breaking down the boast

I'll break this down one at a time.

#1 - Over-stated followers

A few spin doctors on the Twitter staff have released numbers to us stating that the average Twitter user has 126 followers. How many followers do you have?

Chances are the number is close to 6.

According to this fantastic little study, the top 10% of Twitter users have so many followers as to render the rest of us useless. The top 0.1% of Twitter users have some 18,000 followers. The top 10% have over 450.

Do the math - 0.1% of 54.7 million users (in August 2009) gives us 54,700 users with over 18,000 followers. I'm going to go out on a limb and say that these people are skewing the numbers.

After all, in order to get into this study, you must be deemed an "Active" user. The mind-boggling criteria to be deemed "Active" you might ask?

..."users with at least five followers, five friends, or five updates."

#2 - Over-Stated Advertising Potential

If 20% of traffic is going through the website, then 80% is not. That means 80% of all traffic runs through a non-advertising portal such as a smartphone.

How long do you think it will take for the 20% to shrink to 0% once advertising pros start gumming up the Twitter website with banner ads?

Haven't you ever stopped to think why there aren't any ads on that site yet? Please...

#3 - It doesn't make any money

They don't make any money. This simple truth is why Twitter needs that $1B valuation right now: they are running out of venture capital. Because they don't make any money.

Twitter is great

Please don't misinterpret my ramblings to say that Twitter isn't a fantastic social media tool. It is. In fact, it's the meaning of life for many a Twitter user and that must be worth something.

But let's not get carried away here...

Thursday, September 24, 2009

Twitter Knows Where You Are!!!

It's true - Twitter knows exactly where you are. You Tweet, they jot down your location.

"Of course they know where I am."

You are using a computer or smartphone to update your Twitter account. It isn't terribly difficult to decipher someone's whereabouts when they connect to a website. A qualified application developer can pull the information either from your computer's IP address or from GPS/geolocation data in your cell phone.

Who cares, right?

Well, according to a few techie-press sources we trust, they are starting to collect and save the data. The Twitter folks say they will only save the data for 14 days.

Regardless, saving data means it can be displayed. Or sold to the highest bidder. How much would you pay to know that a repeat customer just walked into your coffee shop?

Or better yet, how would you like to know that Bob Smith, a repeat customer, just walked into your coffee shop and he's aggravated with the long line at the counter?
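Pairing a saved tweet location with a store's coordinates is mostly geometry. A minimal sketch using the haversine formula (the coordinates and the 50-meter threshold are made-up assumptions for illustration):

```javascript
// Great-circle distance between two lat/lon points, in meters.
function metersApart(lat1, lon1, lat2, lon2) {
  var R = 6371000; // mean Earth radius in meters
  var toRad = function (d) { return d * Math.PI / 180; };
  var dLat = toRad(lat2 - lat1);
  var dLon = toRad(lon2 - lon1);
  var a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
          Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) *
          Math.sin(dLon / 2) * Math.sin(dLon / 2);
  return 2 * R * Math.asin(Math.sqrt(a));
}

// "Did Bob just walk in?" - is the tweet within ~50 meters of the shop?
function justWalkedIn(tweetLat, tweetLon, shopLat, shopLon) {
  return metersApart(tweetLat, tweetLon, shopLat, shopLon) < 50;
}
```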

The ultimate demographic data?

Think of the power of knowing someone's preferences AND their close proximity to your business. Specifically, your business in Minneapolis, Minnesota on the corner of 4th Ave. and 12th St.

"Hey Bob! Welcome! Say, sorry about the long line. How would you like a free coffee? Great!"

You just gained a very happy customer. Granted, Bob is probably freaking out about the Big Brother implications, but it's a free coffee just the same.

He'll probably talk about both the free coffee and the Big Brother issues to his friends. They'll undoubtedly mention your store. And if Bob is like my friends, they will all go into that coffee shop, Twitter-phone in hand, Tweeting away about how great you are.

They're sold.

But it's scary just the same.

Bing takes search share from Google

Bing has started to take some search share from Google.

Yahoo has dropped from 20.1% in May to 19.3% in August. Google has dropped from a 65% share in May to 64.6% in August, which is only half of the market share loss experienced by Yahoo.

Microsoft, which started at an 8% share in May has increased to a 9.3% share in August according to the latest comScore search engine rankings.

It remains to be seen whether Bing's market share gain comes from the advertising initiative and the "I have to try it" factor, or from actual customer satisfaction. If Bing's success is driven by customer satisfaction, then they will easily have a double-digit share of the search market before the year is out. Satisfied customers tend to give great word-of-mouth advertising, which can only help Bing in the long run.

If Google continues to lose market share at the same pace while Bing grows at this pace, Bing should overtake Google around August of 2020. We'll be watching. (Yes, we're aware this is faulty math.)
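For the curious, here is what the admittedly faulty straight-line version of that math looks like in code (a sketch using only the comScore figures quoted above). Taken literally, the linear projection actually crosses well before 2020:

```javascript
// Linear extrapolation of the May-to-August 2009 comScore share figures.
function crossoverMonths(a0, aSlope, b0, bSlope) {
  // Solve a0 + aSlope * t = b0 + bSlope * t for t (months after August 2009).
  return (a0 - b0) / (bSlope - aSlope);
}

var googleSlope = (64.6 - 65.0) / 3; // share points lost per month
var bingSlope = (9.3 - 8.0) / 3;     // share points gained per month
var months = crossoverMonths(64.6, googleSlope, 9.3, bingSlope);
// months comes out around 98, i.e. late 2017 - which is exactly why this
// kind of straight-line extrapolation shouldn't be taken seriously
```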

Thursday, September 10, 2009

Does domain age matter?

Within the last year Google has finally gone on record to comment on the significance of a domain name's age and how it affects rankings in search engines.

The most recent instance occurred when Matt Cutts addressed this in a video on the GoogleWebmasterHelp YouTube channel back in May.



Cutts also provided a similar glimmer of insight earlier in the year when he said:
To the best of my knowledge, no search engine has ever confirmed that they use length-of-registration as a factor in scoring. If a company is asserting that as a fact, that would be troubling.

So, yes, we do have two recent statements wherein Google specifically touches on the issue of domain registration length and its effect on search engine rankings, but (un)surprisingly both of these responses are fairly ambiguous. He isn't specifically saying that a domain's age doesn't have an effect on search engine rankings, just that it doesn't matter all that much when you look at the big picture.

If we were to take Cutts' statements for what they are, then we can infer that a three month old domain won't have any significant or noticeable competitive handicap when compared to a one or even two-year-old domain. A number of people have come out of the woodwork to argue against this point, and we've even begun to see a number of registration companies and domain squatters who are more than happy to use a domain name's age as a selling point.

But for many of the people who have shown good returns on established domains, their sites are often tied to high-quality, well-written content, as well as established and strong inbound links. Regardless of whether or not age plays a large role in Google's algorithm, it's very rare for a brand new domain to have all of these things associated with it at launch. It takes planning, follow-through, and more importantly time to pull all of these things together. So it should come as no surprise that a site that has been actively updated and maintained for a year or two would hold an advantage over a newly christened site; there just wasn't as much time or effort put into the new site.

None of the factors that go into SEO exist in a vacuum, and a good portion of optimizing a site well often involves seeing the forest for the trees. While many people within the industry can argue the merits of holding on to established domains, the focus should never be specifically centered on one aspect. Now that we have an official word on this matter, one would hope that the subject could be dropped in favor of techniques that speak more to the substantive long-term and short-term needs of a site.

Tuesday, September 8, 2009

CraigsList Makes Top-50 Retailer List

The cover story for the STORES.org Favorite 50 online retailers list sees an odd newcomer among the ranks of perennial favorites. We're used to seeing Wal-Mart, eBay and Best Buy at the top of the list, but the big surprise is Craigslist's debut at No. 25.

Is Craigslist a Retailer?

Interestingly enough, Craigslist's characterization as a retailer is a topic of contention among many retail gurus. The site isn't actually a point of purchase website as we would require of a true online retailer.

Regardless, the STORES list of Favorite 50 online retailers is based upon a survey of 8,600 customers in June 2009. The customer is always right!

Friday, August 28, 2009

Bing Gains Ground - Sorry Yahoo

Google has held the lion's share of search volume for years now, and it continues to grow. July numbers from StatCounter put the volume in the range of 77%. Yahoo follows with 11%, and Bing is up to 9.4%.

It would appear that Microsoft's huge advertising blitz (and a much improved product) have given them an upward growth curve. They're up 1.4% at the expense of Yahoo, which is down a percent.

The share of search dollars spent is also shifting up for Google, according to J.P. Morgan stats. Google captured about 72% of search dollars in the 2nd quarter of 2009. That's up almost 2% compared to the same time in 2008 - and the change is almost entirely at Yahoo's expense.

Mobile Search

The most surprising numbers come in the area of mobile search, where Yahoo is strong at 34% of the search market in the U.S. Google holds about 63% of the market, according to comScore data.

Display Advertising

Where does Yahoo still outperform the others? Display advertising.

Yahoo controls 13% of the display ad views on content-network sites. Microsoft and Google lag behind at 5.5% and 1.3% respectively.

Monday, August 24, 2009

Nick Passes Google Analytics Individual Qualification (GAIQ)

We are proud to announce that every member of our Minneapolis SEO team now holds the Google Analytics Individual Qualification (GAIQ). Nick Davis joins Ken Kralick and Dan Epley among the holders of this prominent certification.

Google Analytics is the tool of choice for many beginning web analytics users but most quickly hit a glass ceiling of functionality without proper training. We strongly believe that the true power of Google Analytics can rival many of the expensive web analytics tools on the market if you have the know-how to unlock Segmentation, Filters, and Event tracking.

GAIQ Certification

The Google Analytics IQ certification is proof that an individual holds proficiency in the Google Analytics advanced tool set. It provides solid assurance to a client that the "analytics pro" actually has training and experience in the tool.

Monday, August 10, 2009

Ken Passes Google Analytics Individual Qualification (GAIQ)

We are proud to announce another certification on our SEO team! Ken Kralick has joined Dan Epley in the ranks of Google Analytics Individual Qualification (GAIQ) holders.

Google Analytics is intended to be approachable for a new user, but there are advanced functions buried inside the software. This advanced functionality in Segmentation, Filters, and Event tracking gives the free analytics tool the power to rival the expensive web analytics tools on the market.

GAIQ Certification

The Google Analytics IQ certification is proof that an individual holds proficiency in the Google Analytics advanced tool set. It provides solid assurance to a client that the "analytics pro" actually has training and experience in the tool.

Experience and Training make the difference

Wouldn't you like to know how Pay Per Click (PPC) visitors are interacting with your website? Do you think the PPC visitors do the same thing on your website as the organic search visitors? What about repeat visitors? Can you filter out internal company traffic? From all your branch offices?

All of this is simple to track in Google Analytics but you need the training and experience to make it work. The Google Analytics Individual Qualification (GAIQ) assures clients that we have that knowledge.
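As one example of the "filter out internal traffic" piece: Google Analytics' IP-exclusion filters are regular-expression based. A sketch of the idea (the IP range below is a documentation-only example range, not a real office block):

```javascript
// Matches any address in the 203.0.113.0/24 documentation range - the sort
// of pattern a GA "Exclude traffic from IP addresses" filter relies on.
var internalIpPattern = /^203\.0\.113\.\d{1,3}$/;

function isInternalVisit(ipAddress) {
  return internalIpPattern.test(ipAddress);
}
```

Each branch office's address range would get its own pattern (or one combined regex) in the profile's filter settings.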

Many Certifications

Our SEO team holds certifications across the many facets of our services. We believe that industry certifications are important and that they help us stand out from the crowd of competitors. If nothing else, it proves that we are willing to put our money where our mouth is, pass the industry tests, and keep the certifications current.

Thursday, August 6, 2009

New Google Chrome Beta is Faster

Our main complaint with the existing Google Chrome browser was the occasional (and unexplainable) lag during startup and while browsing. We found it to be enough of an irritation that everyone in the office returned to Firefox or Internet Explorer.

The new beta release may change those choices:

According to Google engineers, the latest "beta release shows over 30% improvement on both the V8 and SunSpider benchmarks over our current stable channel release."

It's hard to scoff at a measurable gain of 30%.

Wednesday, August 5, 2009

Flash vs. SEO

We've been testing Flash content in the search engines for more than 3 years. There's no surprise here - in our findings Google can read the content (sometimes) but HTML is best in all our tests.

So, the big question remains - how do you get that fancy font to appear in HTML?

You have 2 choices -

1) Place the image in the visible area of the page and tuck the text in a div layer. Either place the div layer under the image or shove it off the page with a -5000px x- or y-axis placement. If this sounds treacherous, it is. This is an old trick and Google can see right through it. You will undoubtedly be deemed suspicious and your SEO rankings may falter.

2) Word from the Google Webmaster Blog folks suggests using a script that calls Flash to alter the display of the content. This practice displays the same content to visitors and the Googlebot alike. A win-win as far as we're concerned.

From the blog:

"sIFR: Some websites use Flash to force the browser to display headers, pull quotes, or other textual elements in a font that the user may not have installed on their computer. A technique like sIFR still lets non-Flash readers read a page, since the content/navigation is actually in the HTML -- it's just displayed by an embedded Flash object."

We use sIFR on most of our marketing sites. The new First Scribe home page uses sIFR in the top navigation buttons. We present the proper typeface while also enabling optimized text links to the top-level pages.
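For reference, a sIFR 3 setup looks roughly like the sketch below (based on the public sIFR documentation; the font movie path, selector, and color are illustrative assumptions, not our actual configuration):

```javascript
// Guarded so that without Flash/sIFR the plain HTML text simply stays visible.
var rockwell = { src: '/swf/rockwell.swf' }; // compiled font movie (hypothetical path)

function applySifr(sifr) {
  if (!sifr) return false;   // sIFR not loaded: graceful degradation to HTML text
  sifr.activate(rockwell);   // register the font movie
  sifr.replace(rockwell, {   // swap matching elements for Flash-rendered text
    selector: '.topnav a',
    css: '.sIFR-root { color: #336699; }'
  });
  return true;
}

applySifr(typeof sIFR !== 'undefined' ? sIFR : null);
```

The important part is that the original text links remain in the HTML, so search engines and non-Flash visitors see exactly what the Flash visitors see.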

Friday, July 31, 2009

I have to be on Twitter ... don't I?

From @mitchjoel
Bad: What should we do on twitter?
Good: Why should we be on twitter?

I think this sums up the Twitter conversation going on in many businesses. Twitter is all over the news, in many circles of conversation, and people want to know how they can use it to improve their businesses.

The question I hear most from people is what they should do, but as @mitchjoel states, it's not what, but why. Are you going to be able to use Twitter to actually enhance your product? Improve customer service? Engage in conversation? Or is your only purpose one-way marketing?

My personal opinion is that people first think of Twitter as an easy one-way marketing tool, like a billboard or newspaper ad. That is a failure to use Twitter to its full potential.

By figuring out the why before the what, you increase the chances that you'll turn Twitter into a useful tool that enhances your business's success.

Plus, you might find out you don't need to be on Twitter and save yourself the effort of figuring out what to do.

Thursday, July 30, 2009

Microhoo! - The Microsoft/Yahoo! Search Partnership

Microsoft (Bing) and Yahoo! announced a search partnership to begin in early 2010. You can read more at CNN Money. This partnership will put Bing's search results behind 25-30% of the total search market. Yahoo will still be responsible for attracting premium advertisers; no word yet on the direct effect this will have on PPC advertisers.

The search partnership is well timed, as Bing continues to spend its way to popularity. This should lead to another round of talk about the now more interesting Bing vs. Google competition. Although Google is still king, and will be for the foreseeable future, this is the first real competitor (as far as search share goes) they've seen this decade.

With Bing now serving a substantial share of search results, will PPC costs in Google begin to fall as advertisers shift budgets around? Or will advertisers even bother shifting budgets any more than they already have? Will the relatively low PPC costs of Yahoo! and Bing begin to rise?

Plenty of questions surround this partnership; it should make for an interesting 2010 for SEM.


SEO and Social Media Don't Mix

Given the immense growth of social media sites over the past six years and the importance of link building in search engine marketing, it's not surprising that one common and flawed question continues to come up every few months: "How much of an effect do social media sites have in determining our website's visibility and PageRank?"

The simplest answer is: little to none. Generally speaking, the largest social networking and social bookmarking sites provide very little utility when it comes to link building. While many sites such as MySpace, Facebook, Del.icio.us, YouTube and even Flickr once allowed unfettered direct linking, nowadays many social media sites obfuscate off-site links with page forwards, frames, and the more common rel="nofollow" attribute on anchor tags. These small changes have essentially nullified any previous and current efforts to improve a site's Google PageRank via social networks.
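To see why nofollow neutralizes these links, consider how a link-collecting crawler might treat them. This is a minimal sketch, not any search engine's actual code; the HTML sample and class name are our own invention.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects hrefs from anchor tags, skipping any marked rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.followable = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold several space-separated tokens, e.g. rel="external nofollow"
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rel and "href" in attrs:
            self.followable.append(attrs["href"])

html = ('<a href="http://example.com/a">followed</a>'
        '<a rel="nofollow" href="http://example.com/b">ignored</a>')
parser = LinkExtractor()
parser.feed(html)
print(parser.followable)  # → ['http://example.com/a']
```

The nofollow link is still perfectly clickable for a human visitor; it simply never enters the pool of links that pass ranking credit.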

Naturally, from an SEO perspective, this brings us back to square one. If it isn't possible to effectively utilize social media to increase your website's visibility, then what inherent value does it even have?


One of the best approaches is to think of your social media pages as micro-sites for your preexisting base of customers; they can serve as convenient locations where you can provide former and current customers with content that's fresh, interesting, and immediate. Whereas you would normally apply a long-term strategy of attracting new customers to your company's website in the form of competitive SEO, a Facebook page is the perfect place to connect with current customers in order to create short-term conversion opportunities in the form of product announcements, discounts, and important company news.

That's just one example, of course. But it is one of the many opportunities that social media can provide your company if you approach it as a unique, social tool and not simply as a means for improving your PageRank.

Tuesday, July 28, 2009

Verizon to cut 8,000 jobs

Multiple news agencies reported that Verizon plans to cut 8,000 jobs to offset a 21% decline in quarterly profit. In the last five months, AT&T has cut about 12,000 jobs and Sprint Nextel has pared down another 8,000.

That being said, is this the final gasp of the telecom bust of 1999-2000?

Could it be that the historic copper telephone cable between us is in the throes of a final cut as the US moves to Voice over IP connectivity? One thing is for certain: cutting 28,000 jobs in the telecom market doesn't bode well for 3rd quarter employment numbers.

Monday, July 27, 2009

New First Scribe Website

The First Scribe team is proud to announce the launch of the new web design for www.FirstScribe.com. The design was two months in the making, so we are happy to see it live!

Check it out at FirstScribe.com!

The new design focuses attention around our three main service areas:
  1. Web Design
  2. Web Development
  3. Search Engine Optimization
Content silos were created to guide visitors to each area of expertise. Please email us with your feedback - we are always happy to receive constructive criticism.

Thursday, July 16, 2009

Microsoft Bing! Bomb? Boom?

Here's a very quick summary of events, in case you haven't been following the soap opera courtship between Microsoft and Yahoo over the last 2 years: Google is kicking their butts in search.

According to Comscore numbers, Google holds roughly 64% of all Internet Searches. Yahoo and Microsoft want a ticket to the ball but their search algorithms have long lagged behind Google's stronger ability to serve relevant results.

Enter Bing!

Microsoft decided to take the bull by the horns and write a new search algorithm - named Bing! The new engine launched in the last days of May, and the reviews are in: they're mixed.

At First Scribe, we are relatively indifferent. The Bing engine seems to work, although it appears to fall for some old SEO techniques in the area of keyword-loaded directory structures. We are most impressed with the persistent search history feature and the image search, but that doesn't seem to be enough for our staff to use Bing any more than Google.

The Big Budget

Multiple news sources have put the marketing budget for the launch of Bing at between $80 and $100 million.

The payoff?

With a month on the open market behind Bing, Comscore reports that Bing received 8.4% of Internet search queries in June 2009, up from 8% in May. Google sat flat at 65% and Yahoo! dropped from 20.1% to 19.6% in the same time frame.

What's the ROI?

It's always difficult to measure the ROI of a marketing plan as broad-reaching as this one. However, we can tell you that many of us were looking for Microsoft to break the 10% search market share mark, and they fell short of that expectation.

We have a suggestion -

Maybe Microsoft would have done well to spend a portion of the budget on a Google Adwords campaign.

Huh! Looks like they did. I wonder if they use Omniture Analytics.

Tuesday, July 14, 2009

God More Popular Than MSN?

July 6, 2009 marked an important date for Christians when the world's oldest bible went online. The Codex Sinaiticus dates from the 4th century and it is arguably one of the most important texts in all of Christianity. The book contains nearly 800 of the original 1,460 pages, including roughly half of the Old Testament and the entire New Testament.

Pages had been separated from the whole and traveled to institutions across Europe. Starting in 2005, the project coordinators collected the existing pages and scanned each to a digital format. The composition became available online last week.

Instant Popularity

Now, the kicker -

The Codex Sinaiticus received 96.4 million hits in the first 48 hours online.

With some simple math we can extrapolate a number for the entire month. We recognize that we are taking phenomenal liberties in our analysis, so these numbers are dubious at best. But let's not forget that we are talking about God on a technology blog...

With a little math, presto - 96.4 million in 48 hours converts to 1.45 billion in 30 days
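The arithmetic behind that conversion is simple: a 30-day month is 720 hours, or fifteen 48-hour windows, assuming (generously) that traffic held steady after launch. A quick sketch:

```python
# Back-of-the-envelope extrapolation from the post's numbers:
# 96.4 million hits in the first 48 hours, projected over a 30-day month,
# assuming the launch-week traffic rate holds (it almost certainly wouldn't).
hits_48h = 96.4e6
hours_in_month = 30 * 24              # 720 hours, i.e. fifteen 48-hour windows
monthly_hits = hits_48h * hours_in_month / 48
print(round(monthly_hits / 1e9, 2))   # → 1.45 (billion)
```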

For comparison purposes:

Monday, July 13, 2009

First Scribe Hiring Again!

There isn't a rock big enough to hide under if you are hoping to avoid the economic climate this year. That being said, First Scribe continues to thrive as a company, hiring our 10th member of the team last month.

Welcome Ben Koren to our Web Development team!

Ben and his wife Amanda recently moved back to the Twin Cities, where they grew up. Before the move, Ben studied Computer Science at Winona State University, where he found his passion for web development and server administration.

In his spare time, Ben enjoys time outdoors with his wife, video games, cooking, and home improvement.

Thursday, July 2, 2009

Analytics Certifications

As marketing budgets come under more scrutiny, there is an increased need to stretch every dollar for optimum performance. Stretching that dollar is a topic by itself, but to find out how far each dollar goes, we need proper analytics in place. Whether that solution is Omniture or Google, First Scribe covers both avenues.

Dan Epley recently completed the Omniture Certified Professional: SiteCatalyst and Google Analytics Individual Qualification certifications, ensuring that whichever package is chosen, we're prepared.

Omniture is the industry-leading product that provides web sites with one place to measure, analyze, and optimize data from all online initiatives across multiple channels.

Google Analytics is a web analytics solution that gives you insights into your website traffic and marketing effectiveness. With Google Analytics, you're more prepared to write better-targeted ads, strengthen your marketing initiatives and create higher converting websites.

Friday, June 19, 2009

First Scribe is Moving!

First Scribe is proud to announce that we have outgrown our space in Golden Valley and we are on the move! We will close our office at 3pm today and open in our new space Monday morning.

Our new address:

110 Cheshire Lane
Suite 105
Minneapolis, MN 55305

Tuesday, June 2, 2009

Social Media and Facebook

Facebook recently announced topping the 200 million user mark, surpassing many of the big traffic giants on the Internet. Google, Yahoo and MSN still claim more users, but none of them have the staying power of social media.

What does it mean?

In a few short words: You probably already know what it means. That's the whole idea of social media.

The allure of Facebook, Twitter and Myspace is that they're intended to mean something different to every user. Members upload photos, post links to their favorite websites, and update their own daily (or hourly) status.

Why do we care?

Facebook, Twitter and Myspace are all .COM sites. That is to say, they are all commercial ventures in search of a profit. They appear to be doing a good job on their part - now how do we monetize the situation without - and here's the kicker - becoming a hindrance to the social media process?

There is obviously a great deal of traffic through these websites and where there's traffic, there's marketing potential. However, we're talking about social conversations (ostensibly) and interrupting that conversation with a negative distraction is tantamount to the pop-up ads of old.

How do we do it?

This post is the tip of the iceberg of conversations to come. We have a few, quick ideas for you to get started and we will circle back to this conversation over the next few months.

  1. Start your own profile and take a look at the ads around the site. If they annoy you, they will annoy others.

  2. Search for your favorite brands, stores, and activities. Do they have a "Fan Page"?

  3. Look at your friends' profiles. Are they pointing to favorite pages? Are they linking to product postings, notes, events, or videos?

  4. Are your friends talking about the same things?

Inserting yourself into these areas amounts to the old-fashioned way of direct marketing - handing people a print ad on the street, as it were.

Instead of yelling into the wind, you and your company need to start listening to the voices and interacting with the group.

It's a brave new world out there and your old Marketing Manual doesn't work in this space.

Friday, April 10, 2009

Yahoo and Microsoft Back To The Table?

It appears that Yahoo and Microsoft are back at the table talking about a search partnership. The talks appear bereft of any takeover discussion, and the general news surrounding both companies appears to support the death of any takeover bid.

Yahoo's new CEO, Carol Bartz, joined after the multiple takeover bids and doesn't appear to carry any of the baggage from the multiple battles between the two companies in 2008. Microsoft appears to be moving on with its new search engine development (internal name Kumo).

What does it mean for the search world?

From our Search Engine Marketing perspective, we are happy to see Microsoft and Yahoo speaking in terms of combating Google's huge market share (multiple sources put it around 63%). Whether they can actually perform in this arena is questionable, but we prefer that they at least state they will try...

Until then, track your PPC numbers closely and diversify where you are able.

Wednesday, March 11, 2009

Craig Davis is a Google Adwords Qualified Individual


Craig Davis of First Scribe Inc. has been using Google's Pay Per Click advertising service, Google Adwords, to varying degrees for roughly 4 years. To further his knowledge of the Adwords system, he recently studied for, and handily passed, the Google Advertising Professional Exam to become a Google Adwords Qualified Individual.

The Google Adwords certification exam is an exhaustive test covering all aspects of the Adwords system. Topics include account creation, billing, administration and optimization. The intent is to ensure marketing professionals have a complete understanding of the Google Adwords platform. With this knowledge a Google Advertising Pro can create and administer high-performance Adwords campaigns.

Craig is a valuable member of the First Scribe Search Engine Marketing team. In his tenure he has managed many of our most prominent SEM accounts. We are proud to have another team member among the ranks of the Adwords Qualified Individuals.

Monday, February 23, 2009

Nick Davis a Google Adwords Qualified Individual


Nick Davis of First Scribe Inc. has been using Google's Pay Per Click advertising service, Google Adwords, for roughly 2 years. As part of his ongoing development at First Scribe, he recently passed the Google Advertising Professional Exam to become a Google Adwords Qualified Individual.

This exam is a comprehensive test covering all aspects of the Adwords system from account creation and administration through to successful campaign management and optimization. The training and exam also cover the use of Google Adwords to measure Conversion and Return on Investment from Pay Per Click advertising.

Nick has done well to take the training, internalize the techniques, and then handily pass the certification exam. Nick is a valuable member of the First Scribe Internet Marketing team and we are proud to have another team member among the ranks of the Adwords Qualified Individuals.

Friday, January 9, 2009

First Scribe Hosting Infrastructure Moves to SAN Storage

In the first week of 2009, First Scribe completed the migration of our web, mail, and database servers to a SAN (Storage Area Network) based storage solution. This provides faster performance, better reliability, and almost instantaneous backup and restore capabilities on our already solid hosting platform. The SAN-based storage will help us maintain the high level of service our customers have come to expect as our business and infrastructure grow. The SAN storage solution, coupled with our currently deployed virtualization infrastructure, ensures our clients' sites are hosted on a locally managed, cutting-edge platform that provides the necessary reliability and performance to succeed in today's competitive online arena.