One of the first things you should consider when optimizing your blog is on-page SEO – that is, optimizing the content on the page so that search engines can easily find it. That is what you want, right? People finding your blog. Great! This post will give you simple tricks for optimizing your blog for search engines. Let’s get people to your blog!
The first and most important element is the URL. The URL should contain your main keyword – the term you most want people to find you by, or the short phrase that best describes what your page is about. Think about it for a second: when people search for a term or phrase and see it in the URL, it signals that the website is highly relevant to what they are looking for. Makes sense?
The second most important detail is the title of your page. The title should be a short phrase – longer than the URL in word count, but not stuffed with keywords. It should read naturally and tell the reader what the article is about. No magic here.
Is content king? I don’t think it’s the make-or-break factor for a blog, but you do need content that is both good and unique. Good content engages readers and keeps them on the page. You don’t want high bounce rates caused by poorly written or boring content – that’s bad SEO. We want good SEO.
Put pictures and videos on your page!
Nobody likes to stare at a blank page with 1,500 words on it. People want to be entertained and get an idea of what your page is about at a glance; then they will read it. I hope this has been a useful guide. This should be your starting point before you build your pages. Search engine optimization is all about relevance: how relevant is the page you want people to see, given the information in the URL, title, and content? Keep that in mind as you build your blog.
When it comes to brick-and-mortar storefronts, local businesses often struggle to compete with neighboring big brands. Statistics show that, even for a well-known local store that’s established a strong relationship with its customers and built a community through the years, having such a neighbor can be detrimental. But what about a newly opened business? Does it have any chance of competing with popular brands? My experience has led me to believe there’s only one way a locally owned business can overcome big competition: it needs to take advantage of local SEO.
Recently, in collaboration with AccuRanker, I conducted a survey that touches upon the difficulties local businesses face when trying to become visible in Google’s local results. We analyzed more than 300,000 local SERPs across multiple industries (beauty, medical services, auto services, legal, shopping, etc.) to get a clear understanding of what the chances are for a local site to seem attractive to Google.
One of the more curious insights our research revealed is that the legal services niche is among the most competitive. Sure, this finding isn’t rocket science. In fact, I bet on some level you were aware of this (or at least you had a gut feeling). However, this issue is much more complex than it seems. The legal services niche far surpasses other niches in terms of competition and prices.
Does this mean that the legal services niche falls under radically different rules and requires unique SEO tactics? This is exactly the question I set out to answer, and you’re most welcome to follow me on my little investigation!
Gathering the data for this article
After reading this article, you’ll understand the biggest challenges that any legal website faces when trying to become visible in the SERPs. The data here will help ensure that your future strategies are based on informed decisions. Moreover, you’ll be able to streamline your creative process and find non-standard approaches that will cement your success in the legal industry.
To conduct proper research on what SEO strategies local businesses employ in the legal services niche, I took the following steps:
I made a list of keywords unrelated to any brand (which could hardly be classified as local).
I identified the most competitive places in the US for this industry in order to analyze how legal sites build a presence in this extremely aggressive environment.
The first step was simply to do keyword research, which involved a bit more manual work than usual – I tried my best to filter out branded keywords and ones that weren’t relevant to local searches.
With the help of Statista, I was able to get a list of the US states that have the highest employment rates in the legal niche:
This graph shows the US states with the highest number of employees in legal occupations as of May 2014. Source: statista.com
You can see that California, New York, and Florida have the highest number of employees in this industry, hence these locations are the most “densely populated” by law firms and lawyers, and, as a result, the competition in these states should be higher than in other states. After I made a list of the most competitive locations, I was ready to move on to the next step – analyzing the domains that appear in SERPs for the keywords I had previously selected.
Now let’s see what my findings revealed.
The top 5 SEO challenges for the legal niche
The extreme competitiveness of the legal services niche might be explained by the fact that this market generates more than $248 billion USD in revenue (according to a recent report provided by Statista) with only a relatively small number of searches.
To give you a better understanding of the size of the legal services industry in the US, let’s compare it with a bigger market: for instance, if we look at ecommerce, we can clearly see that the revenues generated by the two niches in question are nearly the same (ecommerce sales surpass $256 billion USD), despite the fact that ecommerce traffic share figures are four times greater than in legal services. It’s safe to say that the legal niche has turned out to be a ridiculously competitive market, because it’s an outrageously profitable one. I’m also certain that the success of any SEO activity depends on a deep understanding of how the industry and its major players work.
In the next section, you’ll learn about the main challenges that legal businesses face.
#1. Online legal businesses are dominating local SERPs
Statistics from an IbisWorld report confirm that the online legal services niche was able to generate $4 billion USD in 2015. Moreover, in recent years this niche has been steadily expanding due to the fact that consumers are interested in getting legal services online. That’s why it doesn’t come as a surprise that a company named Rocket Lawyer generates more than 30,000 searches monthly (according to Google Keyword Planner) by helping users deal with their legal issues online. This number of searches proves that online legal services are gradually becoming popular, and people don’t want to spend their time scheduling an appointment with a lawyer anymore.
Now you’re probably wondering how this trend is affecting local SEO, right?
Knowing that New York, Miami, and Los Angeles are among the most competitive locations for the legal niche, I decided to find out which sites are the most visible in local search results there. I took into account more than 500 different keywords related to legal services and compiled a list of the domains that appeared most frequently for those keywords. And here are the top three domains that remain visible in local search results in all three cities:
After making this list, I double-checked these websites to make sure that all of them belong to the online legal services niche. I also decided to dig deeper and manually checked the top twenty domains that were most visible across all the locations I analyzed, in order to understand what kind of legal services they provide. I found out that 55.6 percent of the sites I analyzed belong to the online legal services niche. That means that local businesses now have to compete not only with global businesses, but also with online legal businesses that, by default, have better positions in SERPs, as the main goal of their business is to increase their online presence by getting more organic traffic from Google.
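The domain-frequency tally described above is easy to reproduce. Here’s a minimal sketch – the keywords and domains are hypothetical placeholders, and real SERP data would come from a rank tracker’s export:

```python
from collections import Counter

# Hypothetical data: domains seen in the top organic results for each
# tracked keyword. Real data would come from a rank tracker's export.
serps = {
    "personal injury lawyer miami": ["avvo.com", "findlaw.com", "injurylawyers.com"],
    "divorce attorney miami":       ["avvo.com", "findlaw.com", "lawyers.com"],
    "dui lawyer miami":             ["avvo.com", "lawyers.com", "findlaw.com"],
}

visibility = Counter()
for keyword, domains in serps.items():
    # Count each domain once per SERP it appears in
    visibility.update(set(domains))

# Most frequently visible domains across all tracked keywords
for domain, count in visibility.most_common(3):
    print(domain, count)
```

Sorting domains by how many SERPs they appear in is exactly the kind of “most visible domains” list this section refers to.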
#2. Google doesn’t give priority to local legal businesses in organic search results
Apart from the strong presence of online businesses in local organic SERPs, I was struck by the steady visibility of the top twenty websites that appear in local search results in New York, Los Angeles, and Miami. The shocking truth I discovered about Google’s local SERPs is that less than 20 percent of sites were unique across all the studied locations. This means that search results are occupied by global and online businesses in 80 percent of cases. Furthermore, the top three most visible domains remain the same in all three cities, and they are as follows:
I also discovered that all three of these websites belong to the online legal services niche and, in addition to their strong SEO visibility, have a good number of backlinks. I am of the opinion that local businesses have no chance of competing with them whatsoever.
As I studied the 20 percent of websites that are unique, two curious cases of locally based businesses caught my eye – Injurylawyers.com and Cellinoandbarnes.com. Let’s take a closer look at these two websites.
From Injurylawyers.com’s “Contact” page, I learned that it operates mostly in Florida. However, I don’t think the reason it ranks so highly in local search results in Miami is its physical presence there. Even at a quick glance, it becomes clear that Injurylawyers.com ranks so high in local results because of its website’s overall performance. As you can see from the screenshot below, the site has a good number of referring domains, as well as a decent amount of organic traffic:
Another site that caught my attention – Cellinoandbarnes.com – has a branch based in New York. The history of this legal company goes back over 50 years, and without any doubt Cellino and Barnes is a well-known and trusted brand. Plus, Google recognizes it as a brand. The very fact that its brand name is searched for more than 6,000 times a month speaks volumes about the trustworthiness of this legal company:
All these facts show that Cellinoandbarnes.com’s visibility in New York SERPs is because of the domain’s general performance in Google US organic search results:
My quick research proves that, in practice, Google doesn’t give priority to NY-based legal companies and still mostly relies on general ranking factors. And it seems obvious now that any online business can easily outperform an offline SMB legal company by increasing the number of backlinks, brand mentions, and site visits it receives.
#3. The local pack is still a saving grace for local businesses
One year ago, Google implemented a major change that dramatically minimized local businesses’ chances of becoming visible in local packs: Google replaced the 7-pack in SERPs with a 3-pack. And I was quite interested to figure out what kinds of businesses now hold these three positions in the legal niche, and whether these results are local.
Despite the fact that local organic SERPs are fully occupied by big online businesses, the local pack is still the best way for locally based legal companies to remain present in Google. My research revealed that 67 percent of sites that appear in local packs for legal services are hyper-local or local. To arrive at this percentage, I analyzed the domains that appear in local packs in New York, Miami, and Los Angeles in terms of their SEO performance in Google US (to do this, I used Serpstat’s Batch Analysis tool).
I was also curious what share of online presence the local legal businesses that appeared in the local pack had, along with the breakdown by state. To mark sites as local, I checked their traffic with the help of Serpstat’s Batch Analysis tool. (I’d like to note that I find Serpstat’s figures most relevant for such purposes, as they parse raw data from Google US; you can easily spot which sites are global and which are local.) And here’s what I found:
Miami – 60% of legal websites appear in the local pack
Los Angeles – 35% of legal websites appear in the local pack
New York – 15% of legal websites appear in the local pack
This was quite an insight, since I had assumed that California would be the most competitive location for the legal niche because – as you may recall from the beginning of this post – it’s the state most densely populated by law firms. It’s also surprising to find New York only in third place on this list. Yet, as you can see, Miami has the greatest number of local sites present in the local pack. Therefore, I believe that being featured in local search results in New York requires far more resources than achieving the same visibility in Miami. And this is something every SEO expert should be aware of.
#4. You can’t stand out without a site – even in local pack results
It’s a well-known fact that Google’s local pack provides businesses with the opportunity to appear at the top of Google SERPs even without a website. According to my previous research, which I conducted in collaboration with the AccuRanker team, the local pack works much better for less competitive niches. What I tried to clarify here is whether you can stand out in a local pack without a website in such an unconventional and competitive niche as legal services. Unfortunately, no, you cannot.
To prove this, I analyzed 986 local SERPs to figure out whether legal brands appear with or without a website. My findings showed that 86 percent of legal businesses that pop up in local packs have a website. This means that even if your business is visible in local packs without a website, in a majority of cases, it’ll be considered by potential clients as less trustworthy, since users usually expect to see a link to a particular domain.
Without a link to a professional-looking website, your business will seem less credible – not only to potential clients, but also to Google. Nevertheless, it’s not unusual for large, global companies to be trusted more than small, local ones. Therefore, small companies need to instill confidence in their potential clients by having a website.
#5. There’s no correlation between a legal website’s ranking number one in a local pack and its number of reviews
I’m certain that every business owner understands the importance of customer reviews. It’s a no-brainer that a level of trust is instantly established when a potential client sees that a local business has reviews, and it definitely increases the likelihood that said client will convert. Also, the very presence of native Google reviews is thought to be among the top 50 local search ranking factors.
However, this study of legal services has already revealed that there are quite a few peculiar ranking factors that business owners need to keep in mind in order to succeed in this niche. That’s why I was curious to know whether there’s any correlation between a site’s number of customer reviews and its ranking #1 in a local pack.
With the help of the AccuRanker team, I was able to get the number of reviews displayed beside each result in the local pack. Afterwards, I analyzed more than 2,000 local SERPs in New York, Los Angeles, and Miami. And here’s what I found:
There’s no correlation between ranking in the first position in a local pack and your number of reviews.
For instance, in New York local pack results, the companies that appear in the third position have 824 total reviews. Those that appear in the first – 732. Moreover, I noticed a good number of cases in which a company that had a solid number of reviews was ranked in the third position, while a business that hadn’t even been reviewed yet was ranked in the first.
Another striking insight I gained: most legal sites never show their potential visitors more than two reviews. Based on this data, I can say the industry as a whole suffers from a lack of native Google reviews – which is why Google ends up ranking businesses that haven’t been reviewed at all so highly. Even if you have a significant number of customer reviews, it won’t help your business rank higher in local pack results.
One final note
Without any doubt, the legal niche presents a lot of unique local SEO challenges that other industries hardly ever face. The high penetration of online legal services into the existing legal market is changing the current business landscape – in particular, it’s drastically affecting local results. Online legal businesses are stealing an outrageous amount of web traffic from local companies, without giving them even a slim chance of ranking as well in local SERPs.
Fortunately, local legal businesses still have priority in local packs, but the highly competitive environment is forcing them to improve their online presence by creating a website. Since a majority of the companies that appear in local packs have sites, your potential clients’ expectations are ratcheting up. In fact, this trend may reinforce searchers’ opinions that businesses without a website are untrustworthy. Furthermore, it seems that Google also prefers to show users local legal businesses that have a site, rather than those that don’t. The only good news is that your number of reviews doesn’t really influence your rankings in local packs.
Still, if a local legal business is interested in attracting clients via the Internet, it shouldn’t hesitate to look for alternative ways of generating traffic in both organic and paid search channels.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
Through patent filings over the years, Google has explored many ways that it might use “freshness” as a ranking signal. Back in 2011, we published a popular Moz Blog post about these “Freshness Factors” for SEO. Following our own advice, this is a brand new update of that article.
In 2003, Google engineers filed a patent named Information retrieval based on historical data that shook the SEO world. The patent not only offered insight into the mind of Google engineers at the time, but also seemingly provided a roadmap for Google’s algorithm for years to come.
In his series on the “10 most important search patents of all time,” Bill Slawski’s excellent writeup shows how this patent spawned an entire family of Google child patents – the latest from October 2011.
This post doesn’t attempt to describe all the ways that Google may determine freshness to rank web pages, but instead focuses on areas we may most likely influence through SEO.
Giant, great big caveat: Keep in mind that while multiple Google patent filings describe these techniques – often in great detail – we have no guarantee how Google uses them in its algorithm. While we can’t be 100% certain, evidence suggests that they use at least some, and possibly many, of these techniques to rank search results.
Google may determine exactly which queries require fresh content by monitoring the web and their own huge warehouse of data, including:
Search volume: Are queries for a particular term spiking (i.e. “Earthquake Los Angeles”)?
News and blog coverage: If a number of news organizations start writing about the same subject, it’s likely a hot topic.
Social media: A spike in mentions of a particular topic may indicate the topic is “trending.”
While some queries need fresh content, other search queries may be better served by older content.
Fresh is often better, but not always. (More on this later.)
Below are ten ways Google may determine the freshness of your content. Images courtesy of my favorite graphic designer, Dawn Shepard.
1. Freshness by inception date
Initially, a web page can be given a “freshness” score based on its inception date, which decays over time. This freshness score may boost a piece of content for certain search queries, but degrades as the content becomes older.
The inception date is often when Google first becomes aware of the document, such as when Googlebot first indexes a document or discovers a link to it.
“For some queries, older documents may be more favorable than newer ones. As a result, it may be beneficial to adjust the score of a document based on the difference (in age) from the average age of the result set.”
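The patents don’t publish a formula, but the decaying score described above behaves like any time-decay function. Here’s a minimal sketch assuming an exponential decay with a made-up half-life – the 180-day figure is purely illustrative, not anything Google has disclosed:

```python
def freshness_score(age_days: float, half_life_days: float = 180.0) -> float:
    """Illustrative decay: the score starts at 1.0 on the inception date
    and halves every `half_life_days`. The half-life value is an
    arbitrary assumption, not a published Google parameter."""
    return 0.5 ** (age_days / half_life_days)

print(round(freshness_score(0), 2))    # 1.0  -> brand-new page
print(round(freshness_score(180), 2))  # 0.5  -> one half-life old
print(round(freshness_score(720), 2))  # 0.06 -> four half-lives old
```

The exact shape of the curve doesn’t matter for the point being made: any boost tied to the inception date fades as the document ages.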
2. Amount of change influences freshness: How Much
The age of a webpage or domain isn’t the only freshness factor. Search engines can score regularly updated content for freshness differently from content that doesn’t change. In this case, the amount of change on your webpage plays a role.
For example, changing a single sentence won’t have as big of a freshness impact as a large change to the main body text.
“Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time.”
In fact, Google may choose to ignore small changes completely. That’s one reason why when I update a link on a page, I typically also update the text surrounding it. This way, Google may be less likely to ignore the change. Consider the following:
“In order to not update every link’s freshness from a minor edit of a tiny unrelated part of a document, each updated document may be tested for significant changes (e.g., changes to a large portion of the document or changes to many different portions of the document) and a link’s freshness may be updated (or not updated) accordingly.”
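The quoted test for “significant changes” could be approximated by measuring what fraction of a document’s words changed between versions. This is a rough sketch under assumed numbers – the 20% threshold is invented for illustration; the patent only says “a large portion”:

```python
def is_significant_change(old_text: str, new_text: str, threshold: float = 0.2) -> bool:
    """Rough sketch: treat an edit as 'significant' when at least
    `threshold` of the vocabulary changed. The 20% cutoff is a made-up
    illustration, not a figure from the patent."""
    old_words, new_words = set(old_text.split()), set(new_text.split())
    if not old_words:
        return bool(new_words)
    changed = len(old_words ^ new_words)  # words added or removed
    return changed / len(old_words | new_words) >= threshold
```

Under a test like this, swapping one word in a long paragraph falls below the threshold and gets ignored, while rewriting the paragraph crosses it – which is why updating the text around a changed link may help the change register.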
3. Changes to core content matter more: How important
Changes made in “important” areas of a document will signal freshness differently than changes made in less important content.
Less important content includes navigation links, ads, and boilerplate elements such as sidebars and footers.
Conversely, “important” content often means the main body text.
So simply changing out the links in your sidebar, or updating your footer copy, likely won’t be considered as a signal of freshness.
This brings up the issue of timestamps on a page. Some webmasters like to update timestamps regularly – sometimes in an attempt to fake freshness – but there exists conflicting evidence on how well this works. Suffice to say, the freshness signals are likely much stronger when you keep the actual page content itself fresh and updated.
4. The rate of document change: How often
Content that changes more often is scored differently than content that only changes every few years.
For example, consider the homepage of the New York Times, which updates every day and has a high degree of change.
“For example, a document whose content is edited often may be scored differently than a document whose content remains static over time. Also, a document having a relatively large amount of its content updated over time might be scored differently than a document having a relatively small amount of its content updated over time.”
Google may treat links from these pages differently as well (more on this below). For example, a fresh “link of the day” from the Yahoo homepage may be assigned less significance than a link that remains more permanently.
5. New page creation
Instead of revising individual pages, fresh websites often add completely new pages over time. (This is the case with most blogs.) Websites that add new pages at a higher rate may earn a higher freshness score than sites that add content less frequently.
“UA may also be determined as a function of one or more factors, such as the number of ‘new’ or unique pages associated with a document over a period of time. Another factor might include the ratio of the number of new or unique pages associated with a document over a period of time versus the total number of pages associated with that document.”
Some webmasters advocate adding 20–30% new pages to your site every year. Personally, I don’t believe this is necessary as long as you send other freshness signals, including keeping your content up-to-date and regularly earning new links.
6. Rate of new link growth signals freshness
Not all freshness signals are restricted to the page itself. Many external signals can also indicate freshness, oftentimes with powerful results.
If a webpage sees an increase in its link growth rate, this could signal relevance to search engines. For example, if folks start linking to your personal website because you’re about to get married, your site could be deemed more relevant and fresh (as far as this current event goes).
“…a downward trend in the number or rate of new links (e.g., based on a comparison of the number or rate of new links in a recent time period versus an older time period) over time could signal to search engine 125 that a document is stale, in which case search engine 125 may decrease the document’s score.”
Be warned: an unusual increase in linking activity can also indicate spam or manipulative link building techniques. Search engines are likely to devalue such behavior. Natural link growth over time is usually the best bet.
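The patent language above boils down to comparing the number of new links gained in a recent window against an older window of equal length. A minimal sketch, where the window sizes and labels are my own assumptions:

```python
def link_trend(recent_links: int, older_links: int) -> str:
    """Compare new-link counts in a recent time window vs. an older
    window of equal length, per the patent's description. The labels
    and the bare comparison are illustrative, not Google's logic."""
    if recent_links > older_links:
        return "growing"    # may signal rising relevance/freshness
    if recent_links < older_links:
        return "declining"  # may signal a stale document
    return "steady"

print(link_trend(recent_links=40, older_links=25))  # "growing" (e.g. pre-wedding buzz)
```

In practice you’d feed this from a backlink index’s first-seen dates; the point is simply that the trend, not the absolute count, carries the freshness signal.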
7. Links from fresh sites pass fresh value
Links from sites that have a high freshness score themselves can raise the freshness score of the sites they link to.
For example, if you obtain a link from an old, static site that hasn’t been updated in years, it may not pass the same level of freshness value as a link from a fresh page, e.g. the homepage of Wired. Justin Briggs coined the term FreshRank for this.
“Document S may be considered fresh if n% of the links to S are fresh or if the documents containing forward links to S are considered fresh.”
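The quoted rule translates almost directly into code. The patent leaves n unspecified, so the 50% default below is a placeholder:

```python
def is_fresh_by_links(link_freshness: list[bool], n_percent: float = 50.0) -> bool:
    """Direct reading of the quoted rule: document S is considered fresh
    if at least n% of its inbound links come from fresh documents.
    The 50% default for n is an arbitrary placeholder."""
    if not link_freshness:
        return False
    fresh_share = 100.0 * sum(link_freshness) / len(link_freshness)
    return fresh_share >= n_percent

print(is_fresh_by_links([True, True, False]))  # True: 2 of 3 linking docs are fresh
```

Note the recursive flavor of the definition: a document’s freshness depends on the freshness of the documents linking to it, which is what makes the “FreshRank” analogy apt.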
8. Traffic and engagement metrics may signal freshness
When Google presents a list of search results to users, the results the users choose and how much time they spend on each one can be used as an indicator of freshness and relevance.
For example, if users consistently click a search result further down the list, and they spend much more time engaged with that page than the other results, this may mean the result is more fresh and relevant.
“If a document is returned for a certain query and over time, or within a given time window, users spend either more or less time on average on the document given the same or similar query, then this may be used as an indication that the document is fresh or stale, respectively.”
You might interpret this to mean that click-through rate is a ranking factor, but that’s not necessarily the case. A more nuanced interpretation might say that the increased clicks tell Google there is a hot interest in the topic, and this page – and others like it – happen to match user intent.
For a more detailed explanation of this CTR phenomenon, I highly recommend reading Eric Enge’s excellent article about CTR as a ranking factor.
9. Changes in anchor text may devalue links
If the subject of a web page changes dramatically over time, it makes sense that any new anchor text pointing to the page will change as well.
For example, if you buy a domain about racing cars, then change the format to content about baking, over time your new incoming anchor text will shift from cars to cookies.
In this instance, Google might determine that your site has changed so much that the old anchor text is now stale (the opposite of fresh) and devalue those older links entirely.
“The date of appearance/change of the document pointed to by the link may be a good indicator of the freshness of the anchor text based on the theory that good anchor text may go unchanged when a document gets updated if it is still relevant and good.”
The lesson here is that if you update a page, don’t deviate too much from the original context or you may risk losing equity from your pre-existing links.
10. Older is often better
Google understands the newest result isn’t always the best. Consider a search query for “Magna Carta.” An older, authoritative result may be best here.
In this case, having a well-aged document may actually help you.
Google’s patent suggests they determine the freshness requirement for a query based on the average age of documents returned for the query.
“For some queries, documents with content that has not recently changed may be more favorable than documents with content that has recently changed. As a result, it may be beneficial to adjust the score of a document based on the difference from the average date-of-change of the result set.”
A good way to determine this is to simply Google your search term, and gauge the average inception age of the pages returned in the results. If they all appear more than a few years old, a brand-new fresh page may have a hard time competing.
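That gut-check can be expressed as a tiny heuristic: average the estimated ages of the current top results and compare against a cutoff. The one-year threshold here is my own assumption for illustration:

```python
def query_deserves_freshness(result_ages_days: list[float],
                             threshold_days: float = 365.0) -> bool:
    """Heuristic version of the tip above: if the average age of the
    current top results is below the cutoff, the query likely rewards
    fresh content. The one-year threshold is an illustrative assumption."""
    avg_age = sum(result_ages_days) / len(result_ages_days)
    return avg_age < threshold_days

# Ages (in days) of the top results for two hypothetical queries
print(query_deserves_freshness([30, 90, 10, 45]))         # True: a fresh SERP
print(query_deserves_freshness([1200, 2000, 900, 1500]))  # False: an aged SERP
```

If the function returns False for your target query, a brand-new page is fighting the average age of the result set, not just the competition’s authority.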
Freshness best practices
The goal here shouldn’t be to update your site simply for the sake of updating it and hoping for better ranking. If this is your practice, you’ll likely be frustrated with a lack of results.
Instead, your goal should be to update your site in a timely manner that benefits users, with an aim of increasing clicks, user engagement, and fresh links. These are the clearest signals you can pass to Google to show that your site is fresh and deserving of high rankings.
Aside from updating older content, other best practices include:
Create new content regularly.
When updating, focus on core content, and not unimportant boilerplate material.
Keep in mind that small changes may be ignored. If you’re going to update a link, you may consider updating all the text around the link.
Steady link growth is almost always better than spiky, inconsistent link growth.
All other things being equal, links from fresher pages likely pass more value than links from stale pages.
Engagement metrics are your friend. Work to increase clicks and user satisfaction.
If you change the topic of a page too much, older links to the page may lose value.
Updating older content works amazingly well when you also earn fresh links to the content. A perfect example of this is when Geoff Kenyon updated his Technical Site Audit Checklist post on Moz. You can see the before and after results below:
Most important, be useful.