This is a guest contribution from Larry Alton.
Web developers, content managers, marketing teams, and many other online professionals rely on Google Analytics to understand visitor trends. However, you can run into a significant amount of noise, which can skew your Google Analytics numbers and your subsequent interpretations of this data.
Luckily, you can filter out certain types of traffic so that your numbers don’t get watered down by your own visits, inflated by web crawlers, or duplicated because of letter-case discrepancies in web addresses. Here are three main filters to consider setting up as you move forward with a Google Analytics strategy.
Cutting Out Internal Traffic
Every time you and your colleagues browse your own website, you can skew your traffic numbers. Luckily, you can filter these visits out of your Google Analytics reports, so that you get a more accurate representation of your traffic.
Just head over to your Admin page and select “Filters” under the “View” column. Next, click on “+New Filter” and make sure that the “Create New Filter” bubble is selected.
Name your filter something like “Exclude office traffic” or “Exclude home traffic.” Choose the “Custom Filter” option, then select “IP address” from the dropdown menus.
When you enter the IP address in the Filter Pattern field, you’ll need to put a backslash before each dot, in keeping with Google’s regular expression requirements.
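To see why the backslashes matter: in a regular expression an unescaped dot matches any single character, so a bare IP address pattern can match strings it shouldn’t. The sketch below uses Python’s `re` module to illustrate; the IP address is a hypothetical example, and Google Analytics only needs the final escaped pattern, not any code.

```python
import re

# Hypothetical office IP address, for illustration only.
ip = "203.0.113.42"

# Escaping each dot restricts the match to literal dots.
pattern = re.escape(ip)  # "203\.0\.113\.42" -- what you'd paste into the Filter Pattern field

print(pattern)
print(bool(re.fullmatch(pattern, "203.0.113.42")))  # True: matches the real IP
print(bool(re.fullmatch(ip, "203a0b113c42")))       # True: unescaped dots act as wildcards
print(bool(re.fullmatch(pattern, "203a0b113c42")))  # False: escaped pattern rejects it
```

So the pattern to enter for this example address would be `203\.0\.113\.42`.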
Excluding Bots and Spiders
It can be extremely frustrating to examine your web traffic data, only to see that certain recurring bots and spiders account for a large chunk of the pie. Luckily, Google is taking proactive measures to protect Analytics users from these annoyances.
You can opt into Google’s automated bot and spider filtering by going to your Admin panel, clicking on “Reporting View Settings” and checking off the box that reads, “Exclude all hits from known bots and spiders.” However, some bots and spiders will still be able to leak through. You can target these individual irritants by creating a new filter, selecting “Custom” and then choosing “Visitor ISP Organization.” Then enter the service provider of the bot using a regular expression.
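A pattern for the ISP Organization field works the same way as any other regular expression: it just needs to match the provider names you want to exclude. The snippet below sketches that matching logic in Python; the provider names are placeholder examples, not a vetted bot list, and in practice you would paste only the pattern itself into the filter.

```python
import re

# Hypothetical pattern matching two ISP organization names; alternation (|)
# lets one filter cover several providers.
bot_isp_pattern = re.compile(r"amazon|microsoft corp", re.IGNORECASE)

for isp in ["Amazon Technologies Inc.", "Microsoft Corp", "Comcast Cable"]:
    flagged = bot_isp_pattern.search(isp) is not None
    print(f"{isp}: {'filtered' if flagged else 'kept'}")
```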
Keep an eye on your analytics, and be sure to create manual filters for additional bots that attempt to sneak past you. This can prevent bothersome bots and spiders from skewing your website’s data.
Forcing Lowercase
If visitors enter a URL into their browser or click links that use a mix of uppercase and lowercase characters, then you could wind up with duplicate Google Analytics entries for the same destination. Luckily, you can fix this issue by creating a filter.
Just create a brand new filter and call it something like “Force Lowercase.” Choose “Custom,” click on the “Lowercase” bubble, and select “Request URI.” Once this is done, you should stop seeing multiple entries when browsers load up a page using different letter cases.
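The effect of the Lowercase filter on the Request URI can be sketched with a toy page-view log. The paths below are made-up examples: without lowercasing, the same page shows up as several separate entries; after lowercasing, the counts merge into one.

```python
from collections import Counter

# Hypothetical page-view log: one page requested with different letter cases.
hits = ["/About-Us", "/about-us", "/ABOUT-US", "/contact"]

raw_counts = Counter(hits)                  # the About page appears as 3 separate entries
merged = Counter(h.lower() for h in hits)   # lowercasing collapses them into one

print(raw_counts)
print(merged)  # {'/about-us': 3, '/contact': 1}
```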
Increase the accuracy of your Google Analytics traffic data by using filters to cut through the noise. Don’t allow your metrics to become skewed by your own internal traffic, spiders and bots, or by web addresses that contain a mixture of letter cases.
Originally at: Blog Tips at ProBlogger