

Facebook Visited More Than Google in 2010, Traffic Analyst Firm Says

To be “bigger than Google” in the tech world is a pretty big feat. But for Facebook’s Mark Zuckerberg, it’s something he can officially mark off his “to-do” list, if he’s got one. According to Hitwise, a traffic analysis firm, the social networking site officially surpassed the search giant as the most visited website of 2010, and by a pretty significant margin, no less. These numbers, however, reflect U.S.-based traffic only.

Hitwise reported that Facebook accounted for a total of 8.93 percent of all Web visits in the United States between January and November. Google, on the other hand, still high on the chart at number two, managed to nab “only” 7.19 percent of total U.S. Web visits over the same period. There are a few honorable mentions to go around, though.

For one, Yahoo! took two spots in the rankings, with Yahoo! Mail grabbing the number three spot and Yahoo!’s main search engine snatching the fourth. And finally, happy to have a spot in the top 10 (we’re sure), is Microsoft’s decision engine, Bing. After being named Time’s Person of the Year, and now this, Zuckerberg certainly has quite a bit to celebrate going into 2011.

You can read the full press release below for more information, including more details about most searched-about products and people.

Press Release

New York, N.Y., Dec. 29, 2010 – Experian® Hitwise®, a part of Experian Marketing Services, has analyzed the top 1,000 search terms for 2010, and Facebook was the top-searched term overall. This is the second year that the social networking website has been the top search term overall, accounting for 2.11 percent of all searches. Four variations of the term “facebook” were among the top 10 terms and accounted for 3.48 percent of searches overall.

The term “facebook login” moved up from the 9th spot in 2009 to the second spot in 2010. YouTube was the third most-searched term in 2010, followed by craigslist, myspace and facebook.com. Analysis of the search terms revealed that social networking-related terms dominated the results, accounting for 4.18 percent of the top 50 searches.

When combined, common search terms for Facebook – e.g., facebook and facebook.com – accounted for 3.48 percent of all searches in the US among the top 50 terms, which represents a 207 percent increase versus 2009. YouTube terms accounted for 1.12 percent, representing a 106 percent increase versus 2009. AOL search terms accounted for 0.34 percent of searches in 2010 and grew 22 percent versus 2009. Google terms accounted for 0.63 percent, and Craigslist terms accounted for 0.62 percent.

New terms that entered into the top 50 search terms for 2010 included – netflix, verizon wireless, espn, chase, pogo, tagged, wells fargo, yellow pages, poptropica, games and hulu.

Top-visited Websites in 2010

Facebook was the top-visited Website for the first time and accounted for 8.93 percent of all U.S. visits between January and November 2010. Google ranked second with 7.19 percent of visits, followed by Yahoo! Mail (3.52 percent), Yahoo! (3.30 percent) and YouTube (2.65 percent).

The combination of Google properties accounted for 9.85 percent of all U.S. visits. Facebook properties accounted for 8.93 percent, and Yahoo! properties accounted for 8.12 percent. The top 10 Websites accounted for 33 percent of all U.S. visits between January and November 2010, an increase of 12 percent versus 2009.

Other top searches from various categories include:

Personality – top 5 people searches

1. Kim Kardashian

2. Oprah

3. Rush Limbaugh

4. Miley Cyrus

5. Glenn Beck

Movie Titles – top 5 searches from within Movies category:

1. Star Wars

2. Paranormal Activity 2

3. Avatar

4. Transformers 3

5. Harry Potter and the Deathly Hallows

Music – top 5 searched for artists/bands:

1. Lady Gaga

2. Justin Bieber

3. Eminem

4. Taylor Swift

5. Michael Jackson

Branded Destinations top 5 search terms:

1. Disney World

2. Disneyland

3. Six Flags

4. Universal Studios Orlando

5. Great Wolf Lodge

Top TV show searches from Television category

1. Dancing with the Stars

2. American Idol

3. Young and the Restless

The top generic search term within the Television category was “hulu.”

Sports – the top-searched athlete was Tiger Woods, and the top sports team was the Dallas Cowboys.

News and Media – the top-searched person in 2010 was Bret Michaels, followed by Tiger Woods and Sandra Bullock.


Google On Dealing With Low Traffic Pages

Is Pruning Content a Good SEO Strategy?

The question asked was whether it was worth it in terms of SEO to prune content that was not performing. By “performing,” the questioner clarified that he meant content that was not receiving traffic. He also clarified that it was research content that was not adding value to site visitors.

The person asking the question did not mention the concept of content cannibalization, but the outlines of what he was asking conform to the theory. More on that below.

John Mueller suggested that pruning content was just one approach. There are other strategies to use as well.

How to Deal with Low Traffic Web Pages

John Mueller offered two strategies. He said that the first option was to remove the content. The other option was to improve the content. Either approach could be satisfactory. He then cautioned that it may not be a good approach to use Page Views as a metric for determining what is good or bad content.

The second approach, improving the content, contradicts the long-held rote solution of removing non-performing content. We’ll get back to that later. Let’s see what John Mueller actually said.

John Mueller had this to say about content pruning:

“Like, if they have this content on the website and it was initially there for a reason; and maybe it’s not great content, maybe it’s even bad content… one approach is really just to say ok, we will spend time to improve this content.

And the other approach might be from a practical point of view where you say… I know this content is there and I put it out there for a reason but it’s really terrible content and I don’t have time to kind of improve this, I don’t have time to focus on this. Then maybe removing it is a good idea.

Ultimately it’s something where the content that you have available on your website is how you present yourself to search engines. So if you’re aware that this content is bad or low quality or thin, then that’s still how you’re presenting yourself to search engines.

And you can say I can handle that by removing the bad content or I can handle that by improving the bad content. And both are appropriate responses that you can take.

Sometimes there are practical reasons to go one way or the other way. For example if you have millions and millions of pages that are really thin content then maybe it’s not practical to improve all of those. Then maybe it’s something where you say well in the long run I’ll make sure that my new content is good and then you take all of those out. Or it might be that you find a middle ground and say well I’ll improve some but I’ll also take a bunch out that I don’t have time for or I don’t want to have it at all on my website.”

Page Views Can Be an Unreliable Performance Metric

Another participant then brought up the example of a web page containing important and accurate content that didn’t receive traffic because it wasn’t a popular topic at this particular time. In other words, the topic was not trending in any way.

John Mueller responded that this was an important nuance.

“I would not use a metric like page views as the only way of recognizing low-quality content. You’re kind of the expert of your website, you know what’s good and what’s bad. Sometimes a metric like page views helps you to find low quality content.

But I would not blindly say everything that gets few page views is bad content, I need to remove it. …Our algorithms do not look at the number of page views. They try to understand the value of the content…

Just because it’s rarely viewed doesn’t mean it’s a bad piece of content.”

What is Content Cannibalization/Keyword Cannibalization?

Content cannibalization (or keyword cannibalization) is the theory that multiple pages targeting the same topic or keyword compete with one another and dilute a site’s rankings. The problem with the theory is that the phrase is just a label that does not describe the actual issue.

It’s like if your mechanic tells you that your car doesn’t run because it has engine trouble. How does that help you?

The phrase “engine trouble” is just a label that can describe anything from a blown gasket to an alternator that needs replacing. Similarly, the phrase “Content Cannibalization” does not tell you what is wrong. An alternator that needs replacing is more specific and gives you an idea of possible solutions.

And that’s why the person in the Webmaster Hangout had the question. He didn’t know what he should do because he didn’t understand the problem. You can’t find a solution to a problem you don’t understand. Calling it Keyword Cannibalization does not help anyone.

What Content Cannibalization Really Is

The real problem is usually pages with thin content, irrelevant content, duplicate content, outdated content, or content that simply targets long-tail queries.

As John Mueller stated, using page views as a metric will get you in trouble because you’ll remove perfectly good content.

The other issue often being fixed under this label is site architecture. A good site architecture helps the different sections of a site snap into focus in terms of what each topic is about. Those might not be the only issues, but they are the ones that immediately come to mind.

It is unhelpful to discuss these issues under the label of Content/Keyword Cannibalization. It’s more helpful to specifically identify what is going on:

Duplicate content

Thin content

Irrelevant content

Outdated content

Long tail content

Site architecture

Those are six different issues. Most importantly, they are not issues because they are “cannibalizing” keywords or content.

They are problems for very different reasons. Each problem demands a different solution and “pruning” the content is not the only solution. As John Mueller suggested, you can remove the content, you can update the content or you could leave the content alone.

Watch John Mueller discuss content pruning in the webmaster hangout.

Screenshots by Author, Modified by Author

Google Search Console Tutorial: Analyzing Traffic Drops

In a YouTube video, Google’s Search Advocate, Daniel Waisberg, offers valuable tips on quickly spotting and analyzing the reasons for a decline in Google Search traffic.

The timing of this informative guide is perfect, as Google just wrapped up its March 2023 core algorithm update. Many people are now evaluating its impact on their websites.

If you’re trying to figure out how the update has affected your site, the Search Console Performance report is an excellent starting point.

Waisberg demonstrates how, when combined with Google Trends, the Search Console Performance report can help you investigate shifts in traffic patterns.

Main Reasons For Organic Traffic Drops

There can be several reasons for a drop in organic traffic. Waisberg highlights these main causes:

Technical issues: Errors that prevent Google from crawling, indexing, or serving your pages to users. These could be site-level or page-level technical issues.

Manual actions: If your website doesn’t follow Google’s guidelines, some pages or the entire site may be less visible in Google Search results.

Algorithm updates: Core updates may change how some pages perform in Google Search over time, leading to a slow decline in traffic.

Search interest disruption: Changes in user behavior or external influences could affect the demand for certain queries.

Seasonality effects: Regular traffic fluctuations due to weather, vacations, or holidays.

Reporting glitches: Sudden major changes followed by a quick return to the norm could indicate a simple glitch.

Analyzing Traffic Drops Using Search Console Performance Report

The Search Console Performance report is an effective tool for understanding traffic fluctuations.

To access the Performance report in Google Search Console, open your property and select “Performance” in the left-hand navigation.

Waisberg suggests several ways to analyze the data:

Expand the date range to 16 months to view the drop in context and identify any patterns or trends.

Periodically export and store data to access more than 16 months of information.

Compare the drop period to a similar period (e.g., the same month last year or the same day last week) to pinpoint the exact changes.

Explore all available tabs to determine if changes occurred only for specific queries, pages, countries, devices, or Search appearances.

Ensure you compare the same number of days and preferably the same days of the week.

Analyze different Search types separately to understand if the drop was limited to Search, Google Images, Video, or News tab.
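Several of these comparisons are easy to script once the report is exported. Below is a minimal pandas sketch using made-up example data in place of a real export (the query values are invented, and the column names are illustrative assumptions rather than Search Console’s exact export format):

```python
import pandas as pd

# Hypothetical Search Console exports for the two periods being compared.
current = pd.DataFrame({
    "query": ["blue widgets", "widget repair", "buy widgets"],
    "clicks": [120, 40, 15],
})
previous = pd.DataFrame({
    "query": ["blue widgets", "widget repair", "green widgets"],
    "clicks": [200, 35, 50],
})

# Outer merge keeps queries that appear in only one period,
# which is often where a drop is hiding.
merged = current.merge(previous, on="query", how="outer",
                       suffixes=("_current", "_previous")).fillna(0)
merged["delta"] = merged["clicks_current"] - merged["clicks_previous"]

# Sort so the biggest losers surface first.
merged = merged.sort_values("delta")
print(merged[["query", "delta"]].to_string(index=False))
```

The outer merge matters here: an inner join would silently drop queries that disappeared entirely between the two periods, and those are often exactly where the traffic went.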

Using Google Trends For Industry Analysis

Google Trends provides insights into web, image, news, shopping, and YouTube search trends.

Waisberg recommends using it to:

Analyze general trends within your industry or country to identify changes in user behavior or competing products.

Segment data by country and category for more relevant insights into your website audience.

Examine queries driving traffic to your site for seasonal fluctuations or trends.
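One way to check the seasonality angle is to compare the drop period against the same period a year earlier. A small sketch with invented monthly interest values on a 0–100 scale, similar to the index Google Trends reports (the numbers are illustrative, not real Trends data):

```python
import pandas as pd

# Hypothetical monthly interest values for one query: 24 months of data.
idx = pd.date_range("2021-04-01", periods=24, freq="MS")
interest = pd.Series(
    [60, 55, 50, 45, 40, 42, 48, 55, 70, 65, 62, 58,
     61, 54, 49, 44, 41, 43, 47, 56, 69, 64, 61, 57],
    index=idx,
)

# Year-over-year comparison: a dip that also appeared 12 months
# earlier points to seasonality rather than a site-specific problem.
yoy = interest.pct_change(periods=12).dropna()
seasonal = yoy.abs().max() < 0.10  # within 10% of last year => seasonal
print(f"max YoY change: {yoy.abs().max():.1%}, seasonal pattern: {seasonal}")
```

If the year-over-year change is small, the drop likely mirrors a recurring seasonal dip; a large deviation suggests something site- or market-specific is going on.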

In Summary

Understanding the reasons behind Google Search traffic drops is crucial. Using the Search Console Performance report and Google Trends, you can identify and analyze the causes of these drops, helping you stay ahead of industry trends and maintain your online presence.

In his next video, Waisberg will explore more ways to analyze search performance, including using a bubble chart.

Featured Image: Screenshot from YouTube, March 2023. 

Source: YouTube

How To Use Traffic Analytics For More Hyper-Targeted Marketing

Just about every business is looking for better ways to connect with their customers. Customer engagement is the key to conversions and loyalty. According to one report by Super Office, businesses with higher engagement rates had an average retention rate of 89%.


The only way to target audiences is by applying data in the proper way so that customers are viewing content that is personalized to them. Unfortunately, 70% of marketing teams report they do not use behavioral data to target their customers. Often, this is because they simply do not know how to collect the necessary data or apply it.

Gathering and utilizing consumer data does not have to be a complicated process. In fact, one of the best data sources that businesses can use for effective targeting comes from website traffic analytics.

Let’s discuss some ways that all brands can leverage this information for successful and hyper-targeted marketing.

Utilize a specialized analytics resource

Google Analytics is often the primary source of website traffic data that businesses rely on. While Google Analytics is certainly a good starting point, you are going to gain much better and deeper insights from tools that offer more features and better processing systems.

Google Analytics provides a good snapshot of the general areas where website traffic is arriving from, in terms of geographical location as well as the online links that drove them to a website.

However, to get a 360-degree view of your online audience, you should be using specialized analytical programs, such as Finteza. This program offers real-time data through interactive reports that measure multiple metrics as well as background information on each customer. Through specially coded links, you can track specific customers and key behavioral data, such as how many times they have visited your website, which link led them there, and which products they have viewed.
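The “specially coded links” idea can be as simple as appending tracking parameters to the URLs you distribute. Here is a hypothetical sketch using Python’s standard library (the `vid` parameter is an assumption for illustration, not Finteza’s actual tracking scheme):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def tagged_url(base: str, campaign: str, visitor_id: str) -> str:
    """Build a link with campaign-tracking query parameters."""
    params = urlencode({
        "utm_source": "newsletter",
        "utm_campaign": campaign,
        "vid": visitor_id,  # assumption: a first-party visitor identifier
    })
    return f"{base}?{params}"

url = tagged_url("https://example.com/landing", "spring-sale", "abc123")
print(url)
```

When a visitor arrives through such a link, the analytics tool can attribute the session to the campaign and, with a first-party identifier, tie repeat visits back to the same person.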

Get detailed with segmentation and sequences

Data from Google Analytics, or another analytical tool, will need to be utilized here to create audience profiles based on behavioral and demographic data. It is best to start with generalized segments and work your way down to more and more specific sequences.

Let’s say for example that you sell pool toys through an online store. You may want to start by segmenting your audience based on their location. Customers in warmer locations or nearby beaches would likely have a higher interest than customers in colder climates.

From there, you may want to narrow down even further to adults in their mid-twenties to early forties. These people are most likely to have children that would be interested in the toys. Additionally, you can segment again based on variable conditions. This might include whether or not they follow your brand on social media, how many times they have visited your website, or whether they prefer to shop on a mobile device.
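The funnel described above, a broad segment first and then progressively narrower conditions, maps directly onto simple dataframe filters. A sketch with hypothetical visitor data (the regions, ages, and engagement fields are all invented for illustration):

```python
import pandas as pd

# Hypothetical visitor profiles exported from an analytics tool.
visitors = pd.DataFrame({
    "visitor_id": [1, 2, 3, 4, 5],
    "region":     ["FL", "MN", "CA", "AZ", "WA"],
    "age":        [34, 29, 52, 38, 27],
    "visits":     [5, 1, 8, 3, 2],
    "follows_us": [True, False, False, True, False],
})

WARM_REGIONS = {"FL", "CA", "AZ", "TX"}  # assumption: a coarse climate proxy

# Broad segment first: visitors in warm-climate regions.
warm = visitors[visitors["region"].isin(WARM_REGIONS)]

# Narrow further: mid-twenties to early forties, then by engagement.
target = warm[warm["age"].between(25, 44)]
engaged = target[(target["visits"] >= 3) | target["follows_us"]]
print(engaged["visitor_id"].tolist())
```

Each filter is a variable condition of the kind described above, so adding or reordering conditions (social followers, device preference, repeat visits) is a one-line change.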

Know exactly which metrics to track during test periods

Generally, when we think of a successful marketing campaign, we equate it to higher sales. However, if conversion rates do not grow right away, it is not necessarily an indication of failure. As you start to hyper-target audience segments, be sure that you know which metrics are markers of campaign success.

While conversions may not necessarily grow at first, other areas of change could signal success, such as CTRs, time spent on webpages, social followings, international traffic growth, and so on. It is helpful here to use a KPI tracking system that monitors the most important metrics and tracks changes.

Conclusion

Target-based marketing is going to be a process of trial and error. You won’t nail your campaigns on the first go (or probably the second or third). The only way to know the impact that your efforts are making is by tracking the results. You will need to utilize a system that helps you pinpoint the metrics that are changing.

Customers want to feel special. They expect brands to cater to their needs and desires.

One of the best ways that businesses can do this from the start is through targeted marketing – but only if they are using data to fuel their campaigns. By understanding how to properly analyze and apply traffic behavioral data, companies can optimize their approaches and target customer segments more effectively.

Robotics Is More Than Just Automation

While many robots are designed to automate tasks, others are intended to augment human capabilities. Robots are good at performing repetitive tasks on factory floors today, but cutting-edge robotics will go beyond automation use cases. The next challenge lies somewhere in the middle: robots that can make decisions on their own, yet still keep humans in the loop.

Automation is clearly a top reason organizations consider robotic technology in their business processes. We tend to picture either robotic arms or autonomous mobile robots operating independently in business settings, devices that center on improving productivity and efficiency. There are, however, elements of robotic technology focused instead on improving human safety or giving humans increased strength, stamina, or accuracy. By adding only a couple of robots at a time, smaller manufacturers are laying a winning foundation for the individualized offerings and services customers expect. More importantly, they are cultivating a workplace culture that improves the skill and expertise of every employee.

One of the clearest applications of robotic technology designed to extend human abilities rather than automate tasks is surgical robotics. These devices are physically operated by a surgeon while amplifying the surgeon’s capabilities. The most widely known example is the da Vinci system by Intuitive Surgical: it allows a surgeon to remotely operate the robot through a 3D high-definition vision system and several robotic arms, each capable of being fitted with a different surgical tool.

Bringing robots into a business process does not mean employees will disappear. Quite the contrary: job titles and duties should be rethought and, in all likelihood, elevated. By automating monotonous and potentially hazardous tasks, and by providing the data and skills required to do the work well, companies can retrain employees to work alongside robots and do their jobs better and more safely.

Robots will also collaborate with people to address pressing social problems across domains such as industrial safety, healthcare, and disaster relief. In industrial safety, for instance, robots can be deployed in remote areas to maintain public infrastructure. According to Vijayakumar, Director of the Edinburgh Centre for Robotics, “We face huge challenges when it comes to maintaining our infrastructure, such as underground sewerage systems. We will get to a point where it is practically impossible for us to inspect the safety of these things.” Still, it could take some time to get there, even at the current pace of technological development, because of the uncertainties of the real world and the prevalence of noise in sensors and decision-making. “Realistically, we need to get to a point where we can exploit the best of both worlds,” he said. “Robots are very good at performing precise motion, while humans are very good at contextual decision-making.”

Robots embedded with sensors allow the provider to monitor output capacity, which opens up a far more affordable way to optimize working capital. Rather than making a one-time, heavy purchase and then paying a monthly fee for on-call service, companies can use a service-level agreement with the provider that allows usage-based billing without owning the actual robot.

Affordable, connected, intelligent, and adaptable robotics are opening the door to an unprecedented opportunity for small and midsize manufacturers. Yet even though accessing and implementing the technology may appear straightforward, automating everything is never the right answer. Costs can gradually climb to unmanageable levels, especially when one part of the process breaks and triggers a malfunction down the whole line.

Bloggers Are Worth More Than Their Links

Everyone wants to rank across all search engines for their desired terms.

At a certain point in the evolution of Google search, it became apparent that naturally acquiring links (such as those from bloggers) helped with that goal.

That said, algorithms change. So is it still worth it to work with bloggers?

Bloggers Are Worth Far More Than Their Links When It Comes to SEO

As a digital marketer, you are probably tasked with a series of KPIs – from rankings and traffic to conversion rates and revenue.

Is it possible to have bloggers help with all these goals?

The trick is to first stop thinking of websites and blogs as simply places to acquire links. Bloggers are influencers and need to be approached as such.

As a brief primer to influencers, the full version of which you can read in the Ultimate Guide to Influencer Marketing, there are several components to consider:

The primary goal you are focused on.

The buyer persona you wish to target to meet that goal.

The type of influencer needed that best fits the buyer persona.

What medium best suits that influencer type.

Influencer marketing is a deep subject, which during presentations we attempt to distill down into “having someone else tell your story for you.”

But given where this think piece is published, we can make a few assumptions with regards to online vs. offline intent.

What Is Your Primary Goal?

Hopefully, you didn’t say links.

Even while addressing SEO, this is not really the goal, even if it is what you might spend a good percentage of your time focused on. SEO pros also do not exist simply to supply vanity rankings, at least not for the long term.

As with most online marketing functions, the end goal is usually high margin revenue.

How does one maximize high margin revenue? By focusing on increasing the volume of relevant traffic.

Search engines are just a convenient source of this traffic.

Who Is Your Buyer Persona?

The specific end buyer for your product or service is going to be unique to your specific situation. You can go more in-depth with cultural, social, personal, and psychological factors here, but for the sake of brevity, you can make some assumptions related to their technical capabilities.

In this abbreviated approach, what you will need to care about most is categorical focus and fit – what type of information is your buyer consuming that is at least tangentially relevant to what you are selling? That is the category of content you need to be consistently associated with.

What Influencer Type Works Best?

In the above guide, much attention is given to the distinctions between aspirational, authoritative, and peer influencers and the various situations you might need to use each type.

When it comes to raw traffic, aspirational influencers provide the most volume, but are not always categorically focused unless you are selling a product with broad mass appeal. Conversely, peer influencers can be acquired with exceptional categorical fit yet yield much less traffic.

Authoritative influencers, for the sake of the use case of digital marketers eyeing search, are a happy compromise.

They are subject matter experts and as such are strong category fits (provided you are approaching only relevant influencers), and they can drive more traffic than lesser-known industry peers.

What Medium Best Suits the Authoritative Influencer Type?

There are multiple mediums that work well within the confines of a search focus, which can be used before and after the primary traffic drivers are created.

The title of this article does not bury the lede, though: for search purposes, the primary medium to work on with authoritative influencers is blogs, which can be approached in multiple ways.

Let’s discuss why, and what signals blogs are helping to address, then how to implement such a strategy.

Which Search Signals Matter with Influencing Bloggers?

Simplifying for brevity once again, it can be helpful to view modern search algorithms as operating in three general buckets – signals associated with:

Content/architecture.

Links.

User experience.

With the right influencers, all three buckets can be satisfied.

Before any links can ever be built, there should exist at least some meaningful content to point to. One way to create this meaningful content is to hire a reasonably well-known authoritative influencer within your industry to write a series of deep and engaging pieces.

Having this content created can satisfy a multitude of content signals, not the least of which is associating the entity of this writer with your domain, leeching off of their extensive expertise.

This is not about accepting a guest post! It is about recruiting an exceptional writer within your niche to create something meaningful in your domain space.

For all the recent talk of E-A-T (expertise, authoritativeness, and trustworthiness), recruiting an expert that can convey authority and associate their accumulated trust with you allows you to cross a hurdle.

You already know that bloggers provide links, but the links you should care about have far less to do with DR, DA, or whatever metric you have been using. Reach out to influencers for the purpose of acquiring links that provide relevant, converting traffic.

When your focus shifts to acquiring links that pass converting traffic, your mental model shifts closer to what your main goal as a digital marketer should be: high-margin revenue that just so happens to be search-engine-proof.

This mental shift means you can worry less about whether the influencer wants to use “nofollow” and more about how closely themed the blog’s audience is with the buyer persona you are targeting.

The bonus point in acquiring relevant, converting traffic is how it also satisfies a variety of user signals.

The traffic coming in is relevant enough to linger and dwell, navigating to the most important sections, and entering some aspect of your conversion funnel.

When traffic is truly relevant, there also exists the possibility of secondary branded searches occurring to seek out deeper content in the future, which increases the probability of attaining a higher percentage of repeat users. In the opinion of many top SEO pros, this is an especially important signal.
