The Importance of Quality Meta Tags

Page titles and meta descriptions are key elements of your website’s content. Let’s explore what page titles and meta descriptions are, why they matter, and our recommendations for best practices.

Definitions

Page Title (aka Title Tag)

A Page Title is an accurate and short designation of a page’s content. This short description helps a user determine which websites are worthy of a click. Oftentimes, the Page Title is structured with a pipe separating the page topic from the brand name, like this: Featured Clients | webShine. Page Titles appear in the browser tab, and they also show up in the snippet (summary) on the Search Engine Results Page (SERP).

Meta Descriptions

These are the short summaries of web pages that appear in SERPs and on related content pages, such as blogs. If a user searches for keywords that are found in the meta description, those keywords are bolded in the snippet. This emphasis draws the searcher’s eye to results whose meta descriptions use highly relevant keywords and gives greater visibility to your brand.

Important?

In short, yes, page titles and meta descriptions are hugely important. They provide you with an opportunity to market your webpages in the Search Engine Results Pages. They also give the search engines insight into a page’s primary content focus in the header of each page. Historically, they provided an opportunity to infuse keywords into a page. They still act as an area of opportunity for keyword usage, but we suggest a softer sell for both users and search engines to meet 2016 best practices.

Recommendations

SEO...
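To make the definitions above concrete, here is a minimal sketch of how both elements sit in a page’s head section. The title is the example from the post; the description wording is a hypothetical illustration, not a recommendation:

<head>
  <title>Featured Clients | webShine</title>
  <meta name="description" content="See how webShine has helped clients grow their organic search traffic and sales.">
</head>

Both tags live in the HTML head rather than the visible page, which is why search engines can read them even when visitors never see them directly.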

Google is No Longer Showing Right Side Ads on the Desktop

Google continues to tweak the way it presents search results. This latest change has ramifications for both organic and paid search campaigns.

The Change

Google is removing text ads from the right side of desktop search and placing ads at the top, and sometimes the bottom, of the page. For “highly commercial” search terms, there may be 4 ads instead of 3 at the top of Search Engine Results Pages (SERPs). Although text ads will be removed from the right side, Product Listing Ads (PLAs) and Knowledge Graph boxes can still appear in this space.

Reasons for the Change

Google has shown that it values mobile search and wants to align the desktop experience with the mobile experience. Since smartphones can only display one column on the screen, no right sidebar ads can appear there. By removing the right-side ads from desktop search, the results will look similar on desktop and mobile screens. Another theory is that Google recognizes that the click-through rate for right-side ads is poor; new elements such as Knowledge Panels have pushed right-hand ads down the page. Google likely expects the cost-per-click (CPC) inflation that comes with concentrating ads at the top of the page to be more profitable in the long run.

What Does this Change Mean for My Business?

With 4 paid search ads taking up real estate at the top of the page, the organic search results may appear “below the fold”, an area that searchers need to scroll down to see. This makes organic space even more precious and calls for an even sharper, laser-like focus on SEO. It is possible that click...

301 Redirects: What Are They, and Why Are They Important?

Perhaps you’ve redesigned or updated your website recently. During these changes, some of your pages may have moved or been deleted. This raises the question: what happens when a user tries to access pages that are no longer at their old web address (URL)? If the unavailable pages are not redirected, clients may land on an error page when clicking a bookmark or typing in the address.

To avoid sending customers to your 404 “This page cannot be found” error, it’s best to use a 301 redirect. A 301 redirect is a permanent redirect that points one URL (the old one) to another URL (the new one). To put it another way, a 301 redirect says, “The page that used to exist here is now at this location.” When users land on a 404 error page, it creates a poor user experience, search engine ranking power is lost, and repeat visitors disappear. No one wants that to happen! “301” refers to the HTTP status code for this type of redirect. Redirects also matter to the search engines, which have indexed the original content and will continue to look for it at the original URL until directed otherwise via 301 redirects.

How are 301 redirects put into practice?

It’s relatively easy to implement redirects:

1. Run a “crawl” on your site to find the addresses that are missing or changed.
2. Take the crawl export and match new pages with the phantom pages.
3. Implement the redirects on your website in your content management system or .htaccess file, as sketched below.

With properly implemented 301 redirects you can avoid lost ranking, web...
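As a rough sketch of step 3, a single 301 redirect in an Apache .htaccess file can be declared in one line (the old and new paths here are hypothetical examples):

# Send visitors and search engines from the retired page to its replacement
Redirect 301 /old-page.html https://www.example.com/new-page/

Because the redirect carries the permanent (301) status code, search engines know to transfer the old URL’s ranking signals to the new address instead of continuing to request the retired page.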

New Google Algorithm Gives Weight to Mobile Sites on April 21

With Google’s new algorithm release on April 21, the company will expand the role that mobile friendliness plays as a ranking factor.

Why is Google making this change?

We all know that people are increasingly using their mobile devices to access the internet. Last year, internet use underwent a paradigm shift: the number of mobile users surpassed the number of desktop users. This is a colossal change, and Google needed to update its formula to address the shifting landscape of search.

What will the new algorithm focus on?

Google wants to make sure that people have a good mobile experience. To achieve this goal, it changed its search algorithm to give weight to mobile-friendly web pages in searches performed on mobile devices. This won’t affect desktop search; it doesn’t make sense to get a mobile search result on your desktop, or vice versa.

How do I know if my website conforms to Google’s new algorithm?

Since mobile friendliness is now being used as a ranking factor, it’s imperative that your website uses mobile strategies. One evaluation tool is the Mobile Friendly Tool, which lists the factors you need to change to assure compliance. For webShine clients, we’ll use this tool’s feedback to audit your sites and update them if needed. Another tool that helps with mobile readiness is the Google Webmaster Mobile Usability Report. We’ll monitor this report for our clients and make updates where needed. If you’re not a webShine client and want to see your Mobile Usability Report, follow these steps:

1. Log in to Google Webmaster Tools
2. Click on your client name
3. Click Search Traffic
4. Click...
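The post doesn’t prescribe specific fixes, but one common first step toward a mobile-friendly page (assuming a responsive design, which the post does not specify) is a viewport meta tag in the page’s head:

<meta name="viewport" content="width=device-width, initial-scale=1">

This tells mobile browsers to render the page at the device’s natural width rather than a zoomed-out desktop layout, one of the basics that mobile-friendliness checkers commonly flag.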

Standard for Robot Exclusion: Excluding Robots Since ’94

The Standard for Robot Exclusion, which you may know as a robots.txt file, just turned 20 years old. To mark its two decades in existence, we thought it would be illuminating to take a closer look at robots.txt files and how they are used in today’s world. A blog post written by Brian Ussery on the topic is very educational and illustrates this file’s complexity.

Back in the ’90s, robots essentially ran unchecked on the web and poked around in branches of certain websites in which they had no business. To limit access, the Standard for Robot Exclusion came into being. Though its purpose is a simple one, playing bouncer to robots, the nuances of a robot’s response to the robots.txt file are intricate. For example, prohibiting robots from certain areas of a website does not guarantee their exclusion from a search engine results page (SERP). Search engines operate on the premise of indexing as much of the web as is available to them in order to deliver the best results. So if a search engine recognizes a URL on a website that appears relevant to a certain search query, it can bring up that URL on a SERP even if the URL is blocked by a robots.txt file.

So how does a webmaster block robots while simultaneously ensuring the exclusion of a webpage from a SERP? This double exclusion can be achieved by using a meta tag on the page(s) in question. Place the following in the head section of the page(s):

<meta name="robots" content="noindex">

Again, when dealing with robots, instructions...
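For reference, the robots.txt syntax itself is simple. A minimal file that asks all robots to stay out of a hypothetical /private/ directory looks like this:

User-agent: *
Disallow: /private/

One caveat worth noting: a robot can only see a noindex meta tag if it is allowed to fetch the page, so a page that must stay out of the SERPs should carry the meta tag rather than being blocked in robots.txt.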

SEO or Inbound Marketing?

Last week I attended MozCon in Seattle, an inbound marketing conference with a focus on SEO. My primary goal was to ensure that our search engine optimization campaigns are operating at the forefront of the industry and that we are doing everything we can to drive quality traffic and sales for our customers. Second, I was curious about the evolution of SEO and inbound marketing.

Industry thought leaders are moving away from specialization and toward the broader and more nebulous term, inbound marketing. Take, for example, the host of the event, formerly SEOmoz and now Moz. The primary argument for the shift is that to be successful in SEO, a marketer needs to go beyond on-page optimization, technical SEO and link building. Success in organic search includes email marketing, conversion optimization, content marketing, social media and more. Thus, the argument goes, SEOs need a more general term to describe what it is that we do.

Can I continue to focus exclusively on SEO without getting left behind? Is webShine in fact an inbound marketing company despite our claims to be a search engine marketing agency? Are we limiting our services and restricting our customers? In short: yes, no and no. At webShine, we’ll continue to focus on the technical aspects of SEO while also increasing our skills in search engine marketing. Our operations go beyond on-page optimization, link building and technical SEO into the additional arenas required to be successful with today’s search engines. To do so, webShine will continue to collaborate with teams in each marketing arena, which is not exclusively our role, but is required for success...