Thursday, November 30, 2017

One More Link for Dixon Jones

Woke up this morning to see that Dixon Jones, who runs marketing for Majestic, will be transitioning out of his role and into retirement. Majestic has long proven itself to be a top-shelf (maybe even THE top-shelf, but more on that later) link index tool, and his contributions to the industry have been equally top tier. I’m both sad and elated at the news. Dixon was a huge help in getting the Local SEO Ranking Factors study off the ground, graciously providing us backlink data for the first-ever version of the study, and he has been a valuable partner ever since. But it’s always a good thing when people get to pivot or scale back their careers on their own terms. Cheers Dixon, you will be sorely missed.

http://ift.tt/2BxsyZA

What does the redirects manager in Yoast SEO do?

The redirects manager in Yoast SEO Premium is a real lifesaver. It’s a feature we at Yoast use many times a day. Once you’ve used it for a while, you’ll wonder how you ever lived without it. The redirects manager makes everyday website optimization and maintenance a piece of cake. It takes care of all redirect tasks, so you don’t have to think about them as much. In the end, it will save you lots of time and money. Here, we’ll shed some more light on the invaluable redirects manager in Yoast SEO.

Optimize your site for search & social media and keep it optimized with Yoast SEO Premium »

What is a redirect?

Before we get into the awesomeness of the Yoast SEO redirects manager, let’s take a brief look at redirects. A redirect happens when a particular URL is deleted or changed and the browser is served another URL in its place. If a site owner deletes a page and does not redirect that old page, visitors to that page will see a 404 error message/page. So, to send visitors to a substitute URL or another relevant page, you need a redirect.

There are loads of reasons why you might need a redirect:

  • When you delete a post or page;
  • When you change a URL structure;
  • If you move from HTTP to HTTPS;
  • Whenever you move a domain;
  • If you edit the slug of a category;
  • Etc.

Historically, deleting a page and making the correct redirect was a nasty chore. You had to do it manually in the .htaccess file or with server-side rewrite rules, such as Apache’s mod_rewrite or the nginx rewrite module. In all cases, there was code involved, and not everyone was comfortable writing it. Today, with Yoast SEO Premium, that process is dead easy. If you are in need of a WordPress redirect plugin, give this one a try!
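For comparison, a manual redirect in Apache might look something like this (the URLs here are placeholders, not real pages):

```apache
# .htaccess: permanently redirect a deleted page to its replacement
Redirect 301 /old-page/ /new-page/

# Or the mod_rewrite equivalent:
RewriteEngine On
RewriteRule ^old-page/?$ /new-page/ [R=301,L]
```

One wrong character in a file like this can break a whole site, which is exactly why a point-and-click manager is appealing.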

What does Yoast SEO do with redirects?

Using Yoast SEO Premium, making a redirect becomes a straightforward process. It takes just a couple of quick steps. Let’s say you want to delete a post:

  • Open the post that needs to be deleted
  • Move it to trash
  • Choose whether it should receive a 410 content deleted header or a redirect to another page
  • Hit OK and you’re done!

Easy peasy, right?


As you can see, the redirects manager in Yoast SEO Premium is an incredibly simple tool for working with redirects. Whenever you change or delete a post or page, it asks you what you want to do with the old URL: redirect it to another URL, for instance, or serve a 410 content deleted header. This happens in the redirects manager or right in the post editor.

Correctly redirecting pages keeps your site usable, fresh and healthy. Visitors won’t stumble upon dead links, and neither will Google. Google loves sites that are perfectly maintained. The cool thing is that everyone can do this, and you won’t even need to call in your developer to fix it for you.

Not sure how the redirects manager in Yoast SEO works? Check this video and it becomes much clearer:

Types of redirects

The redirects manager supports the most essential redirects; you can find them listed below. If you need more information about these different redirects, please read the Which redirect post. Want to know the difference between a 302 and a 307? We’ve got you covered with this post on HTTP status codes.

  • 301 – Moved permanently
  • 302 – Found
  • 307 – Temporary redirect
  • 410 – Content deleted
  • 451 – Content unavailable for legal reasons

Inside the redirects manager in Yoast SEO

The redirects manager can do a lot more cool stuff. You can bulk edit your existing redirects, for instance to change them from a 307 to a 301. You can also filter redirects to see which ones need changing, or find the redirect for a specific article and change it to something else.


Integrates with Google Search Console

Combined with the power of Google Search Console, you’ll have the ultimate in site maintenance power at your fingertips. Let Yoast SEO Premium access your Search Console account and you’ll see all the crawl errors appear. After that, you can use the redirects manager to create redirects for all 404 errors instantly. Spring cleaning, anyone?

Michiel did an excellent job explaining how you can connect Yoast SEO to Search Console and how to fix crawl errors. Read that if you want to know more about the combined power of these two killer site maintenance tools.



REGEX redirects

Not for the faint-hearted, but for the true redirect kings. That doesn’t mean you can’t learn to use it too, because you should. Making redirects with regular expressions is different: you have to define both what should match and what should happen to it. It is an incredibly powerful tool that can do crazy smart stuff, and it is your go-to tool if you need to do very specific or large-scale redirects.
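To give a rough sense of what a regex redirect does, here is a Python sketch of the matching logic. The URL pattern is invented for illustration, and this is not Yoast’s actual implementation:

```python
import re

# Hypothetical rule: send every dated blog URL to its undated equivalent,
# e.g. /blog/2017/11/some-post/ -> /blog/some-post/
pattern = re.compile(r"^/blog/\d{4}/\d{2}/(?P<slug>[^/]+)/?$")

def redirect_target(old_url):
    """Return the new URL for a matching old URL, or None if no rule applies."""
    match = pattern.match(old_url)
    if match:
        return "/blog/{}/".format(match.group("slug"))
    return None

print(redirect_target("/blog/2017/11/redirects-manager/"))  # /blog/redirects-manager/
print(redirect_target("/about/"))                           # None
```

One rule like this can replace hundreds of individual one-to-one redirects, which is the whole appeal of regex redirects.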

Have Team Yoast install and configure Yoast SEO premium for you! »

WordPress redirect plugin

The redirects manager makes Yoast SEO Premium an excellent tool, not just for SEO but for site maintenance as well. But don’t just take our word for it. As writer Jody Lee Cates told us:

“I hesitated to pay for Yoast Premium because I am a new blogger without much income yet. But I’m so, so happy I did! The time the redirect manager is saving me is priceless! And it’s giving me the freedom to change URL’s to improve SEO without worrying about creating redirects on my own.”

How’s that for an endorsement?

Read more: ‘Why every website needs Yoast SEO’ »

http://ift.tt/2AqFqmI

Google Vince and Venice: How These Two Updates Changed the SEO Landscape

Most of today’s SEO strategies and techniques exist to address the updates Google makes. Every now and then, Google announces updates and tweaks to its metrics and algorithms, to which SEO professionals respond accordingly. These updates are made to optimize Google’s performance and ensure that users get the best information possible.

While most of these updates only mean minor tweaks to your SEO campaign, major updates like Google Vince and Google Venice caused major changes that can still be felt today. Here is how these two updates changed the SEO landscape into what it is today.

Google Venice: Going local

Before this update went live, the way to track your local search listings was through Google Places. While this was a handy method of reviewing local search results, there were occasions where search results included local listings even though the search terms didn’t ask for them.

This changed in 2012 with the Google Venice update, which allowed search results to include local listings based on your IP address or even your physical location. This lets you look for the closest businesses and establishments near your current location, which comes in handy when you are travelling or planning a place for a meeting or an event.

This has helped local SEO in a big way, as local businesses are now able to gain more traffic and visibility, which helps them compete with larger and more established businesses. Google Venice also made using Google My Business much more important, as local businesses are able to track their local SEO and SEM through this effective free tool.

Google Vince: Ranking the Big Brands

There are times when even the smallest of changes can have large implications, and this is evident when it comes to SEO. Google Vince is one of these updates, as it helped bigger brands rank better in search results pages. While Google may have considered it a minor update, its impact was large enough to reshape the SEO industry.

This update was meant to improve the search quality of Google itself and further enforce the value of branding on the internet. It benefited some of the biggest companies, along with government sites, which are seen as having more authority and quality information than smaller websites. Despite the apparent dominance of these brands in search results, the update prompted other websites to improve their quality and create content that is trustworthy and high quality.

The impact of both updates

Looking back, these two somewhat simple updates have had a significant impact on how search results are shown on Google. They have helped businesses gain more footing in search rankings, encouraged websites with better overall quality, and kept harmful and misleading websites in check. In short, these updates made SEO for businesses that much more competitive.

Key Takeaway

Google’s updates have helped create a better search engine that provides the best results possible. Google Vince and Google Venice were two such updates, helping establish business websites as pages with the relevance and authority to be at the top of search rankings.

If you have any questions or inquiries about Google’s latest updates, and SEO in general, leave a comment below and let’s talk.

http://ift.tt/2iqIatk

Semantic Keyword Research and Topic Models

Seeing Meaning

I went to the Pubcon 2017 Conference this week in Las Vegas, Nevada, and gave a presentation about Semantic Search topics based upon white papers and patents from Google. My focus was on things such as Context Vectors and Phrase-Based Indexing.

I promised in social media that I would post the presentation on my blog so that I could answer questions if anyone had any.

I’ve been doing keyword research like this for years: I look at other pages that rank well for keyword terms I want to use, identify phrases and terms that tend to appear on those pages, and include them on pages I am trying to optimize. It made a lot of sense to start doing that after reading about phrase-based indexing in 2005 and later.
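That workflow can be sketched in a few lines of Python: break each well-ranking page into phrases and keep the ones that show up on most of them. The page text below is invented for illustration:

```python
from collections import Counter

def two_word_phrases(text):
    """Split text into the set of lowercase two-word phrases (bigrams) it contains."""
    words = text.lower().split()
    return {" ".join(pair) for pair in zip(words, words[1:])}

def common_phrases(pages, min_pages=2):
    """Return phrases that appear on at least min_pages of the given pages."""
    counts = Counter()
    for page in pages:
        counts.update(two_word_phrases(page))
    return {phrase for phrase, n in counts.items() if n >= min_pages}

pages = [
    "semantic search uses context vectors",
    "context vectors help semantic search",
    "phrase based indexing and semantic search",
]
print(common_phrases(pages))  # {'semantic search', 'context vectors'}
```

Real keyword research would work on full page text and longer phrases, but the shape of the analysis is the same.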

Some of the terms I see when I search for Semantic Keyword Research include things such as “improve your rankings,” “conducting keyword research” and “smarter content.” I’m also seeing phrases I’m not a fan of, such as “LSI Keywords,” which has about as much scientific credibility as keyword density, which is to say next to none. Researchers from Bell Labs wrote a white paper about Latent Semantic Indexing in 1990; it was used with small (fewer than 10,000 documents) and static collections of documents (the web is constantly changing and hasn’t been that small for a long time).

There are many people who call themselves SEOs who tout LSI keywords as keywords based upon having related meanings to other words; unfortunately, that has nothing to do with the LSI that was developed in 1990.

If you are going to present research or theories about things such as LSI, it really pays to do a little research first. Here’s my presentation. It includes links to the patents and white papers that the ideas within it are based upon. I do look forward to questions.

Copyright © 2017 SEO by the Sea ⚓. This Feed is for personal non-commercial use only. If you are not reading this material in your news aggregator, the site you are looking at may be guilty of copyright infringement. Please contact SEO by the Sea, so we can take appropriate action immediately.
Plugin by Taragana http://ift.tt/2zwOENm

Does Tomorrow Deliver Topical Search Results at Google?

The Oldest Pepper Tree in California

At one point in time, search engines such as Google learned about topics on the Web from sources such as Yahoo! and the Open Directory Project, which provided categories of sites, within directories that people could skim through to find something that they might be interested in.

Those listings of categories included hierarchical topics and subtopics; but they were managed by human beings and both directories have closed down.

In addition to learning about categories and topics from such places, search engines used to use such sources to do focused crawls of the web, to make sure that they were indexing as wide a range of topics as they could.

It’s possible that we are seeing those sites replaced by sources such as Wikipedia and Wikidata and Google’s Knowledge Graph and the Microsoft Concept Graph.

Last year, I wrote a post called, Google Patents Context Vectors to Improve Search. It focused upon a Google patent titled User-context-based search engine.

In that patent we learned that Google was using information from knowledge bases (sources such as Yahoo Finance, IMDB, Wikipedia, and other data-rich and well organized places) to learn about words that may have more than one meaning.

An example from that patent was that the word “horse” has different meanings in different contexts.

To an equestrian, a horse is an animal. To a carpenter, a horse is a work tool. To a gymnast, a horse is a piece of equipment that they perform maneuvers upon during competitions with other gymnasts.

A context vector takes these different meanings from knowledge bases, and the number of times they are mentioned in those places to catalogue how often they are used in which context.
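A toy version of that counting, with invented mention counts for the three meanings of “horse”:

```python
# Hypothetical mention counts for "horse", as if tallied from knowledge bases
mentions = {"animal": 70, "work tool": 10, "gymnastics equipment": 20}

# Normalize the counts so the vector records how often each sense is used
total = sum(mentions.values())
context_vector = {sense: count / total for sense, count in mentions.items()}
print(context_vector)  # {'animal': 0.7, 'work tool': 0.1, 'gymnastics equipment': 0.2}
```

A real context vector would be built from far richer signals, but the core idea is this kind of per-sense frequency catalogue.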

I thought knowing about context vectors was useful for doing keyword research, but I was excited to see another patent from Google appear in which the word “context” plays a featured role. When you search for something such as a “horse”, the search results you receive are going to be mixed with horses of different types, depending upon the meaning. As this new patent tells us about such search results:

The ranked list of search results may include search results associated with a topic that the user does not find useful and/or did not intend to be included within the ranked list of search results.

If I were searching for a horse of the animal type, I might include another word in my query that identified the context of my search better. The inventors of this new patent seem to have a similar idea. The patent mentions:

In yet another possible implementation, a system may include one or more server devices to receive a search query and context information associated with a document identified by the client; obtain search results based on the search query, the search results identifying documents relevant to the search query; analyze the context information to identify content; and generate a group of first scores for a hierarchy of topics, each first score, of the group of first scores, corresponding to a respective measure of relevance of each topic, of the hierarchy of topics, to the content.

From the pictures that accompany the patent it looks like this context information is in the form of Headings that appear above each search result that identify Context information that those results fit within. Here’s a drawing from the patent showing off topical search results (showing rock/music and geology/rocks):

Different types of ‘rock’ on a search for ‘rock’ at Google

This patent does remind me of the context vector patent, and the two processes in these two patents look like they could work together. This patent is:

Context-based filtering of search results
Inventors: Sarveshwar Duddu, Kuntal Loya, Minh Tue Vo Thanh and Thorsten Brants
Assignee: Google Inc.
US Patent: 9,779,139
Granted: October 3, 2017
Filed: March 15, 2016

Abstract

A server is configured to receive, from a client, a query and context information associated with a document; obtain search results, based on the query, that identify documents relevant to the query; analyze the context information to identify content; generate first scores for a hierarchy of topics, that correspond to measures of relevance of the topics to the content; select a topic that is most relevant to the context information when the topic is associated with a greatest first score; generate second scores for the search results that correspond to measures of relevance, of the search results, to the topic; select one or more of the search results as being most relevant to the topic when the search results are associated with one or more greatest second scores; generate a search result document that includes the selected search results; and send, to a client, the search result document.

It will be exciting to see topical search results start appearing at Google.

http://ift.tt/2yj3G6I

Using Ngram Phrase Models to Generate Site Quality Scores

Source: http://ift.tt/1JJOg9H
Photographer: McGeddon
Creative Commons License: Attribution 2.0 Generic

Navneet Panda, whom the Google Panda update is named after, has co-invented a new patent that focuses on site quality scores. It’s worth studying to understand how it determines the quality of sites.

Back in 2013, I wrote the post Google Scoring Gibberish Content to Demote Pages in Rankings, about Google using ngrams from sites and building language models from them to determine if those sites were filled with gibberish, or spammy content. I was reminded of that post when I read this patent.

Rather than explaining what ngrams are in this post (which I did in the gibberish post), I’m going to point to an example of ngrams at the Google n-gram viewer, which shows Google indexing phrases in scanned books. This article published by the Wired site also focused upon ngrams: The Pitfalls of Using Google Ngram to Study Language.

An ngram phrase could be a 2-gram, a 3-gram, a 4-gram, or a 5-gram phrase, where pages are broken down into two-word, three-word, four-word, or five-word phrases. If a body of pages is broken down into ngrams, those ngrams can be used to create language models or phrase models to compare against other pages.
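Generating ngrams from a page’s text is simple; a minimal Python sketch:

```python
def ngrams(text, n):
    """Return the list of n-word phrases (ngrams) in the text, in order."""
    words = text.split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

sentence = "the quick brown fox jumps"
print(ngrams(sentence, 2))  # ['the quick', 'quick brown', 'brown fox', 'fox jumps']
print(ngrams(sentence, 3))  # ['the quick brown', 'quick brown fox', 'brown fox jumps']
```

Sliding a window of 2 to 5 words across every page of a site produces the raw material for the phrase models the patent describes.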

Language models, like the ones that Google used to create gibberish scores for sites could also be used to determine the quality of sites, if example sites were used to generate those language models. That seems to be the idea behind the new patent granted this week. The summary section of the patent tells us about this use of the process it describes and protects:

In general, one innovative aspect of the subject matter described in this specification can be embodied in methods that include the actions of obtaining baseline site quality scores for a plurality of previously-stored sites; generating a phrase model for a plurality of sites including the plurality of previously-scored sites, wherein the phrase model defines a mapping from phrase-specific relative frequency measures to phrase-specific baseline site quality scores; for a new site, the new site not being one of the plurality of previously-scored sites, obtaining a relative frequency measure for each of a plurality of phrases in the new site; determining an aggregate site quality score for the new site from the phrase model using the relative frequency measures of the plurality of phrases in the new site; and determining a predicted site quality score for the new site from the aggregate site quality score.

The newly granted patent from Google is:

Predicting site quality
Inventors: Navneet Panda and Yun Zhou
Assignee: Google
US Patent: 9,767,157
Granted: September 19, 2017
Filed: March 15, 2013

Abstract

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for predicating a measure of quality for a site, e.g., a web site. In some implementations, the methods include obtaining baseline site quality scores for multiple previously scored sites; generating a phrase model for multiple sites including the previously scored sites, wherein the phrase model defines a mapping from phrase specific relative frequency measures to phrase specific baseline site quality scores; for a new site that is not one of the previously scored sites, obtaining a relative frequency measure for each of a plurality of phrases in the new site; determining an aggregate site quality score for the new site from the phrase model using the relative frequency measures of phrases in the new site; and determining a predicted site quality score for the new site from the aggregate site quality score.

In addition to generating ngrams from the text on sites, some implementations of this patent generate ngrams from the anchor text of links pointing to pages on those sites. Building a phrase model involves calculating the frequency of n-grams on a site “based on the count of pages divided by the number of pages on the site.”
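Taking that quoted formula literally, a phrase’s relative frequency measure might be computed like this. This is my reading of the patent’s arithmetic, not Google’s actual code, and the site pages are invented:

```python
def relative_frequency(pages, phrase):
    """Fraction of a site's pages that contain the given phrase."""
    pages_with_phrase = sum(1 for page in pages if phrase in page)
    return pages_with_phrase / len(pages)

site_pages = [
    "buy cheap widgets online today",
    "cheap widgets shipped fast",
    "about our widget company",
]
print(relative_frequency(site_pages, "cheap widgets"))  # 2 of 3 pages, about 0.667
```

The phrase model would then map measures like this to baseline quality scores learned from already-scored sites, and aggregate them into a predicted score for a new site.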

The patent tells us that site quality scores can impact rankings of pages from those sites, according to the patent:

Obtain baseline site quality scores for a number of previously-scored sites. The baseline site quality scores are scores used by the system, e.g., by a ranking engine of the system, as signals, among other signals, to rank search results. In some implementations, the baseline scores are determined by a backend process that may be expensive in terms of time or computing resources, or by a process that may not be applicable to all sites. For these or other reasons, baseline site quality scores are not available for all sites.

http://ift.tt/2xs7lQW

Google’s Project Jacquard: Textile-Based Device Controls

Textile Devices with Controls Built into them

I remember my father building some innovative plastics blow molding machines where he added a central processing control device to the machines so that all adjustable controls could be changed from one place. He would have loved seeing what is going on at Google these days, and the hardware that they are working on developing, which focuses on building controls into textiles and plastics.

This is outside of Google’s search efforts, but it is interesting to see what else the company may get involved in, since that now covers a wider and wider range of things, from self-driving cars to glucose-analyzing contact lenses.

This morning I tweeted an article I saw in the Sun, from the UK that was kind of interesting: Seating Plan Google’s creating touch sensitive car seats that will switch on air con, sat-nav and change music with a BUM WIGGLE

It had me curious if I could find patents related to Google’s Project Jacquard, so I went to the USPTO website, and searched, and a couple came up.

Attaching Electronic Components to Interactive Textiles
Inventors: Karen Elizabeth Robinson, Nan-Wei Gong, Mustafa Emre Karagozler, Ivan Poupyrev
Assignee: Google
US Patent Application: 20170232538
Published: August 17, 2017
Filed: May 3, 2017

Abstract

This document describes techniques and apparatuses for attaching electronic components to interactive textiles. In various implementations, an interactive textile that includes conductive thread woven into the interactive textile is received. The conductive thread includes a conductive wire (e.g., a copper wire) that is twisted, braided, or wrapped with one or more flexible threads (e.g., polyester or cotton threads). A fabric stripping process is applied to the interactive textile to strip away fabric of the interactive textile and the flexible threads to expose the conductive wire in a window of the interactive textile. After exposing the conductive wires in the window of the interactive textile, an electronic component (e.g., a flexible circuit board) is attached to the exposed conductive wire of the conductive thread in the window of the interactive textile.

Interactive Textiles
Inventors: Ivan Poupyrev
Assignee: Google Inc.
US Patent Application: 20170115777
Published: April 27, 2017
Filed: January 4, 2017

Abstract

This document describes interactive textiles. An interactive textile includes a grid of conductive thread woven into the interactive textile to form a capacitive touch sensor that is configured to detect touch input. The interactive textile can process the touch-input to generate touch data that is useable to control various remote devices. For example, the interactive textiles may aid users in controlling volume on a stereo, pausing a movie playing on a television, or selecting a web page on a desktop computer. Due to the flexibility of textiles, the interactive textile may be easily integrated within flexible objects, such as clothing, handbags, fabric casings, hats, and so forth. In one or more implementations, the interactive textiles may be integrated within various hard objects, such as by injection molding the interactive textile into a plastic cup, a hard casing of a smart phone, and so forth.

The drawings that accompanied this patent were interesting because they showed off how gestures used on controls might be used:

Controls in action

  • The textile controller.
  • A double tap on the controller is possible.
  • A two-finger touch on the controller is also possible.
  • You can swipe up on textile controllers.
  • An extruder, showing plastic material being heated up to send to a mold.
  • Plastic molded devices with controls built into them.

My father would have gotten a kick out of seeing a plastics extruder in a Google Patent (I know I did.)

It will be interesting to see textile and plastic controls come out as described in these patents.

Added 9/25/2017: Saw this news this morning: This Levi’s jacket with a smart sleeve is finally going on sale for $350

http://ift.tt/2wczHjd

Citations behind the Google Brain Word Vector Approach

Cardiff-Tidal-pools

In October of 2015, members of the Google Brain team announced a new algorithm, described in this post from Search Engine Land: Meet RankBrain: The Artificial Intelligence That’s Now Processing Google Search Results. Gregory S. Corrado, one of the Google Brain team members who gave Bloomberg News a long interview on RankBrain, is a co-inventor on a patent that was granted this August along with other members of the Google Brain team.

In the SEM Post article RankBrain: Everything We Know About Google’s AI Algorithm, we are told that RankBrain uses concepts from Geoffrey Hinton involving thought vectors. The summary in the description of the patent tells us how a word vector approach might be used in such a system:

Particular embodiments of the subject matter described in this specification can be implemented so as to realize one or more of the following advantages. Unknown words in sequences of words can be effectively predicted if the surrounding words are known. Words surrounding a known word in a sequence of words can be effectively predicted. Numerical representations of words in a vocabulary of words can be easily and effectively generated. The numerical representations can reveal semantic and syntactic similarities and relationships between the words that they represent.

By using a word prediction system having a two-layer architecture and by parallelizing the training process, the word prediction system can be effectively trained on very large word corpuses, e.g., corpuses that contain on the order of 200 billion words, resulting in higher quality numeric representations than those that are obtained by training systems on relatively smaller word corpuses. Further, words can be represented in very high-dimensional spaces, e.g., spaces that have on the order of 1000 dimensions, resulting in higher quality representations than when words are represented in relatively lower-dimensional spaces. Additionally, the time required to train the word prediction system can be greatly reduced.

So, an incomplete or ambiguous query that contains some words could use those words to predict missing words that may be related. Those predicted words could then be used to return search results that the original words might have difficulties returning. The patent that describes this prediction process is:

Computing numeric representations of words in a high-dimensional space

Inventors: Tomas Mikolov, Kai Chen, Gregory S. Corrado and Jeffrey A. Dean
Assignee: Google Inc.
US Patent: 9,740,680
Granted: August 22, 2017
Filed: May 18, 2015

Abstract

Methods, systems, and apparatus, including computer programs encoded on computer storage media, for computing numeric representations of words. One of the methods includes obtaining a set of training data, wherein the set of training data comprises sequences of words; training a classifier and an embedding function on the set of training data, wherein training the embedding function comprises obtained trained values of the embedding function parameters; processing each word in the vocabulary using the embedding function in accordance with the trained values of the embedding function parameters to generate a respective numerical representation of each word in the vocabulary in the high-dimensional space; and associating each word in the vocabulary with the respective numeric representation of the word in the high-dimensional space.
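As a toy illustration of how numeric representations can reveal similarities between words, here is a cosine-similarity lookup over invented three-dimensional vectors; a real system would learn vectors with on the order of 1000 dimensions from huge corpuses:

```python
import math

# Invented toy vectors; a trained embedding function would learn these from data
vectors = {
    "horse":  [0.9, 0.1, 0.2],
    "pony":   [0.8, 0.2, 0.1],
    "saddle": [0.7, 0.3, 0.3],
    "guitar": [0.1, 0.9, 0.2],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def most_similar(word):
    """Return the vocabulary word whose vector is closest to the given word's."""
    return max((w for w in vectors if w != word),
               key=lambda w: cosine(vectors[word], vectors[w]))

print(most_similar("horse"))  # pony
```

Nearness in the learned space is what lets a system predict related or missing words from the words it does see in a query.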

One of the things that I found really interesting about this patent was that it includes a number of citations from the applicants. They looked worth reading, and many of them were co-authored by inventors of this patent, by people who are well known in the field of artificial intelligence, or by people from Google. When I saw them, I started hunting for copies of them on the Web, and I was able to find them. I will be reading through them, and I thought it would be helpful to share those links, which was the idea behind this post. It may be helpful to read as many of these as possible before tackling this patent. If anything stands out in any way to you, let us know what you’ve found interesting.

Bengio and LeCun, “Scaling learning algorithms towards AI,” Large-Scale Kernel Machines, MIT Press, 41 pages, 2007. cited by applicant.

Bengio et al., “A neural probabilistic language model,” Journal of Machine Learning Research, 3:1137-1155, 2003. cited by applicant.

Brants et al., “Large language models in machine translation,” Proceedings of the Joint Conference on Empirical Methods in Natural Language Processing and Computational Language Learning, 10 pages, 2007. cited by applicant.

Collobert and Weston, “A Unified Architecture for Natural Language Processing: Deep Neural Networks with Multitask Learning,” International Conference on Machine Learning, ICML, 8 pages, 2008. cited by applicant.

Collobert et al., “Natural Language Processing (Almost) from Scratch,” Journal of Machine Learning Research, 12:2493-2537, 2011. cited by applicant.

Dean et al., “Large Scale Distributed Deep Networks,” Neural Information Processing Systems Conference, 9 pages, 2012. cited by applicant.

Elman, “Finding Structure in Time,” Cognitive Science, 14, 179-211, 1990. cited by applicant.

Huang et al., “Improving Word Representations via Global Context and Multiple Word Prototypes,” Proc. Association for Computational Linguistics, 10 pages, 2012. cited by applicant.

Mikolov and Zweig, “Linguistic Regularities in Continuous Space Word Representations,” submitted to NAACL HLT, 6 pages, 2012. cited by applicant.

Mikolov et al., “Empirical Evaluation and Combination of Advanced Language Modeling Techniques,” Proceedings of Interspeech, 4 pages, 2011. cited by applicant.

Mikolov et al., “Extensions of recurrent neural network language model,” IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 5528-5531, May 22-27, 2011. cited by applicant.

Mikolov et al., “Neural network based language models for highly inflective languages,” Proc. ICASSP, 4 pages, 2009. cited by applicant.

Mikolov et al., “Recurrent neural network based language model,” Proceedings of Interspeech, 4 pages, 2010. cited by applicant.

Mikolov et al., “Strategies for Training Large Scale Neural Network Language Models,” Proc. Automatic Speech Recognition and Understanding, 6 pages, 2011. cited by applicant.

Mikolov, “RNNLM Toolkit,” Faculty of Information Technology (FIT) of Brno University of Technology [online], 2010-2012 [retrieved on Jun. 16, 2014]. Retrieved from the Internet, 3 pages. cited by applicant.

Mikolov, “Statistical Language Models based on Neural Networks,” PhD thesis, Brno University of Technology, 133 pages, 2012. cited by applicant.

Mnih and Hinton, “A Scalable Hierarchical Distributed Language Model,” Advances in Neural Information Processing Systems 21, MIT Press, 8 pages, 2009. cited by applicant.

Morin and Bengio, “Hierarchical Probabilistic Neural Network Language Model,” AISTATS, 7 pages, 2005. cited by applicant.

Rumelhart et al., “Learning representations by back-propagating errors,” Nature, 323:533-536, 1986. cited by applicant.

Turian et al., “MetaOptimize / projects / wordreprs /” Metaoptimize.com [online], captured on Mar. 7, 2012. Retrieved from the Internet using the Wayback Machine, 2 pages. cited by applicant.

Turian et al., “Word Representations: A Simple and General Method for Semi-Supervised Learning,” Proc. Association for Computational Linguistics, 384-394, 2010. cited by applicant.

Turney, “Measuring Semantic Similarity by Latent Relational Analysis,” Proc. International Joint Conference on Artificial Intelligence, 6 pages, 2005. cited by applicant.

Zweig and Burges, “The Microsoft Research Sentence Completion Challenge,” Microsoft Research Technical Report MSR-TR-2011-129, 7 pages, Feb. 20, 2011. cited by applicant.

Copyright © 2017 SEO by the Sea ⚓. This Feed is for personal non-commercial use only. If you are not reading this material in your news aggregator, the site you are looking at may be guilty of copyright infringement. Please contact SEO by the Sea, so we can take appropriate action immediately.
Plugin by Taragana http://ift.tt/2eMALj6

Personalizing Search Results at Google

One thing most SEOs are aware of is that search results at Google are sometimes personalized for searchers, but it’s not something I’ve seen much written about. So when I came across a patent about personalizing search results, I wanted to dig in and see if it could give us more insights.

The patent is an updated continuation patent, and I love to look at those, because it is possible to compare the claims against those of an older version to see whether they reveal how the patented process has changed. Sometimes changes are spelled out in great detail, and sometimes they shift focus to concepts that were in the original version of the patent but weren’t emphasized as much.

One of the last continuation patents I looked at was one from Navneet Panda, in the post Click a Panda: High Quality Search Results based on Repeat Clicks and Visit Duration. In that one, we saw a shift in focus toward more user behavior data, such as repeat clicks by the same user on a site and the duration of a visit to a site.

Personalizing search results
Inventors: Paul Tucker
Assignee: GOOGLE INC.
US Patent: 9,734,211
Granted: August 15, 2017
Filed: February 27, 2015

Abstract

A system receives a search query from a user and performs a search of a corpus of documents, based on the search query, to form a ranked set of search results. The system re-ranks the set of search results based on preferences of the user, or a group of users, and provides the re-ranked search results to the user.

The older version of the patent is Personalizing search results, which was filed on September 16, 2013, and was granted on March 10, 2015.

A continuation patent has rewritten claims that reflect how the patented process may have changed, while keeping the filing date of the original version of the patent.

I like comparing the claims, since that is what usually changes in continuation patents. I noticed some significant changes from the older version to this newer version.

There is a lot more emphasis on “high quality” sites and “distrusted sites” in the new version of the patent, which can be seen in the first claim of the patent. It’s worth putting the old and the new first claim one after the other, and comparing the two.

The Old First Claim

1. A method comprising: identifying, by at least one of one or more server devices, a first set of documents associated with a user, documents, in the first set of documents, being assigned weights that reflect a relative quantification of an interest of the user in the documents in the first set of documents; receiving, by at least one of the one or more server devices, a search query from a client device associated with the user; identifying, by at least one of the one or more server devices and based on the search query, a second set of documents, each document from the second set of documents having a respective score; determining, by at least one of the one or more server devices, that a particular document, from the second set of documents, matches or links to one of the documents in the first set of documents; adjusting, by at least one of the one or more server devices, the respective score of the particular document, to form an adjusted score, based on the weight assigned to the one of the documents in the first set of documents; forming, by at least one of the one or more server devices, a list of documents in which documents from the second set of documents are ranked based on the respective scores, the particular document being ranked in the list based on the adjusted score; and providing, by at least one of the one or more server devices, the list of documents to the client device.

The New First Claim

This is newly granted this week:

1. A method, comprising: determining, by at least one of one or more server devices, preferences of a user or a group of users, wherein the preferences indicate a document bias set and weights assigned to the documents, wherein the weights include distrusted document weights; determining, by the at least one of the one or more server devices, a high quality document set obtained from a document ranking algorithm; creating, by at least one of the one or more server devices, an intersection set of documents which includes documents in both the document bias set and the high quality document set; receiving, by at least one of the one or more server devices, a search query from the user; performing, by at least one of the one or more server devices, a search of a corpus of documents, based on the search query, to form a ranked set of search result documents; determining, by at least one of the one or more server devices, at least one link from the intersection set of documents to at least one document in the ranked set of search result documents, the at least one document not in the intersection set of documents; re-ranking, by at least one of the one or more server devices, the set of search result documents based on the preferences of the user or the group of users, wherein re-ranking the set of search results comprises: identifying a link of the set of links from the intersection set of documents to the document of the set of search result documents, and based on identifying the link, adjusting a rank of the search result document based on the weight assigned to the document in the document bias set from where the identified link originated from; and providing, by at least one of the one or more server devices, the re-ranked search results to the user.

The changes I am seeing in these two different first claims involve what are being called “distrusted document weights” from a “document bias set”, and showing pages from “a high quality document set.” The newer claim makes it more clear that personalized results come from these two different sets of results. It’s possible that it doesn’t change how personalization actually works, but the increased clarity is good to see.

The Purpose of these Personalizing Search Results Patents

We are told that some sites are favored more than others, and some are disliked more than others, and that these preferences are derived from a query or browser history to generate a document bias set:

FIG. 1 illustrates an overview of the re-ranking of search results based on a user’s or group’s document or site preferences. In accordance with this aspect of the invention, a document bias set F 105 may be generated that indicates the user’s or group’s preferred and/or disfavored documents. Bias set F 105 may be automatically collected from a query or browser history of a user. Bias set F 105 may also be generated by human compilation, or editing of an automatically generated set. Bias set F 105 may include a set of documents shared, or developed, by a group that may further include a community of users of common interest. Document bias set F 105 may include one or more designated documents (e.g., documents a, b, x, y and z) with associated weights (e.g. w.sup.a.sub.F, w.sup.b.sub.F, w.sup.x.sub.F, w.sup.y.sub.F and w.sup.z.sub.F). The weights may be assigned to each document (e.g., documents a, b, x, y and z) based on a user’s, or group’s, relative preferences among documents of bias set F 105. For example, bias set F 105 may include a user’s personal most-respected, or most-distrusted, document list, with the weights being assigned to each document in bias set F 105 based on a relative quantification of the user’s preference among each of the documents of the set.

This document bias set mention appears in both the older, and the newer version of the patent.

The patents also both refer to a high quality document set, and that is described in a way that seems to place a lot of attention on PageRank or a Hubs and Authority approach to ranking:

A high quality document set L 110 may be obtained from any existing document ranking algorithm. Such document ranking algorithms may include a link-based ranking algorithm, such as, for example, Google’s PageRank algorithm, or Kleinberg’s Hubs and Authorities ranking algorithm. The document ranking algorithm may provide a global ranking of document quality that may be used for ranking the results of searches performed by search engines. High quality document set L 110 may be derived from the highest-ranking documents in the web as ranked by an existing document ranking algorithm. In one implementation, for example, set L 110 may include the top percentage of the documents globally ranked by an existing document ranking algorithm (e.g., the highest ranked 20% of documents). In an implementation using PageRank, set L 110 may include documents having PageRank scores higher than a threshold value (e.g., documents with PageRank scores higher than 10,000,000). Set L 110 may include multiple documents (e.g., documents m, n, o, p, x, y and z) with associated weights (e.g., weights W.sup.m.sub.L, W.sup.n.sub.L, W.sup.o.sub.L, W.sup.p.sub.L, W.sup.x.sub.L, W.sup.y.sub.L and W.sup.Z.sub.L). The weights may be assigned to each document (e.g., documents m, n, o, p, x, y and z) based on a relative ranking of “quality” between the different documents of set L 110 produced by the document ranking algorithm.

Personalized results served to a searcher are results that come from both the document bias set, and the high quality document set (as the patent says, from an “intersection” between the two sets).
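As a rough sketch of what the first claim describes (all names and data here are illustrative, not taken from the patent), the re-ranking could be modeled like this:

```python
def rerank(results, bias_weights, quality_set, links):
    """Re-rank search results per the patent's described process.

    results: list of (doc, score) from the initial ranked search.
    bias_weights: {doc: weight} for the user's document bias set;
                  weights may be negative for distrusted documents.
    quality_set: set of docs from a ranking algorithm such as PageRank.
    links: {source_doc: set of docs it links to}.
    """
    # Step 1: intersection of the document bias set and the high quality set.
    intersection = set(bias_weights) & quality_set
    # Step 2: adjust the score of any result that an intersection document links to.
    adjusted = []
    for doc, score in results:
        for source in intersection:
            if doc in links.get(source, set()):
                # Boost (or demote) by the weight of the linking bias-set document.
                score += bias_weights[source]
        adjusted.append((doc, score))
    # Step 3: re-rank by adjusted score and return to the user.
    return sorted(adjusted, key=lambda pair: pair[1], reverse=True)
```

For example, `rerank([("a", 2.0), ("b", 1.0)], {"x": 1.5, "z": -1.0}, {"x", "y"}, {"x": {"b"}})` promotes document `b` above `a`, because the trusted document `x` (which is in both the bias set and the high quality set) links to it; a distrusted document's negative weight would demote linked-to results the same way. Document `z` is in the bias set but not the high quality set, so it never influences the ranking.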

If you are interested in how personalized search may work at Google, spending some time with this new patent may provide some insights. Knowing about how two different sets of documents are involved in returning results is a good starting point.


Wednesday, November 29, 2017

Seer’s 2018 Office Dogs Calendar Highlights Local Shelters

Modern workspaces have paved the way for incredible things: collaboration in open floor plans, healthier ways to work with standing desks, and, my personal favorite: office dogs.

We are fortunate to be working in a day and age where, if you land a job at a forward-thinking company, you might have a standing office dog or even celebrate the adorable Bring-Your-Dog-to-Work Day.…

http://ift.tt/2il184F

How Do I Rank Higher in Google Local Search? Our Checklist for Local SEO

How Do I Rank Higher in Google Local Search? Our Checklist for Local SEO was originally published on BruceClay.com, home of expert search engine optimization tips.

The good news: Showing up in Google’s search engine can be extremely beneficial to your local business.

The bad news: Google doesn’t care if you rank high or low. It cares only that there are quality results that answer the query to the total satisfaction of the searcher.

So the pressing question is, how do you rank higher on Google Maps and Google local search results? Improving your local search rankings is possible, and the results are very real. A Google study found that:

  • 4 in 5 consumers use search engines to find local information.
  • 50 percent of local smartphone searches lead to a store visit in less than a day.
  • 18 percent of local searches on a smartphone result in a sale within a day.

If you’re asking, “How does Google local search work, and how can I rank higher in local search?” then this local SEO checklist is for you!

I first presented these 41 factors that contribute to local search ranking at Pubcon earlier this month (see my slide deck at the end). Here, I list them executive summary style, to help you understand how you can increase your local search rankings.

Disclaimers: Each of these topics could be an article in itself, but I’ve tried to give brief explanations and links for further reading, in keeping with a list format. This is not an exhaustive list of local ranking factors. It’s not in priority order either, but grouped into general categories which you can jump to as follows:

Housekeeping Signals

1. Branding
Being a respected business in your community will increase your local search visibility. Google pays a lot of attention to a brand’s perceived trust and expertise. Even if you’re just starting out, aim for happy customers and consistent quality to attract traffic and mentions.

2. Domain name
Your website’s name should accurately represent your business or brand. It’ll be in every URL, so make it something appropriate and easily remembered. Don’t use a keyword phrase alone (e.g., http://ift.tt/2kaBMXA), which risks an exact match domain (EMD) penalty. On the other hand, including a keyword as part of your domain (e.g., http://ift.tt/2kaBNuC) can help you as a local business if it’s tied into your brand name. Search algorithms are getting better and better at weeding out low-quality results, so make sure your domain doesn’t look like spam.

3. Hosting
When it comes to web hosting, think about speed, availability, and maintained software. Choose a host that ensures your content is served up quickly, since page load speed is now a factor in Google’s algorithm. Beyond the hosting platform, there are many ways to speed up your web pages. Using Accelerated Mobile Pages and/or Progressive Web Apps may be worth considering, as well.

4. Content management system (CMS)
Above all else, your CMS should be easy to use. Here, WordPress is king, consistently the top CMS used on the web. Consider how you can improve your system’s functionality with plugins — WordPress.org lists 1,864 plugins for “local” alone. And, don’t forget about a WordPress SEO plugin, too.

5. Compatibility
We’re in a mobile-first world, with the majority of searches happening on smartphones and Google evaluating sites based on their mobile friendliness. Check your site to make sure it’s mobile friendly and optimized for mobile devices — otherwise, your rankings and visitor counts will suffer. Voice search is the next big area of compatibility.

6. Email
Use your business’s domain in your email address (@bruceclay.com) rather than @gmail or another generic provider. It’s a small point, but worth putting on the housekeeping checklist to increase your professionalism and perceived trustworthiness.

Keywords and Content Signals

7. Keyword and content gap analysis
Identify the keywords working for you in terms of hitting key performance indicators and bringing in revenue. Use keyword research to find additional phrases that can serve your personas/community, and examine your competition online for their keywords. Wherever you find a gap in your own content compared to the top-ranking sites, expand accordingly.

8. Detailed competitive review
To get a more in-depth look at your competition, you’ll need to perform a detailed review. Examine their performance in every area in this checklist, then outdo them. The goal is to be the least imperfect with your local SEO.

9. Content creation
Content that informs, educates or entertains readers improves your engagement. We recommend siloing your web content based on the themes your business is about. Set up your navigation and internal links carefully to create a hierarchical structure for the content on your site. Doing so will strengthen your site’s relevance and expertise around those topics.

10. Content variety
Many different types of content can be “localized” to pertain specifically to your community. The list includes images, news, events, blog posts, videos, ads, tools and more. Having a variety of types of content indexed also gives your site more opportunity to rank, since they can appear in the vertical search engines (e.g., Google Images, YouTube, etc.).

Local content types diagram by Mike Ramsey

11. Content creation strategies
To establish yourself as a local authority, tell local stories and express your opinion about the topics your business and your customers are focused on. Excellent content can become a strategy for attracting search traffic and also local expert links.

12. Local videos
When you create videos that are appropriate to your website and region, you’ll soon discover that people will share them more on a local level. Build landing pages for your videos on your site to attract links and mentions. You can do this by uploading a video to your YouTube channel first, then embedding it on your page (copy the HTML right from YouTube’s Share tab into your page’s code).

13. Long-tail rankings
Use locally relevant content to rank higher in searches around the Local Pack. Examples would include posts like “The 5 Best Restaurants in Las Vegas,” which could answer long-tail queries such as, “What are the best restaurants in Las Vegas?”

14. Local relevance
Having content that’s locally focused can improve your reputation and reach in your area. This requires more than doing a find-and-replace on the city name to create hundreds of basically duplicate pages. You can start with templates, but make sure you’re including enough customized text, images and data to be locally relevant.

15. Landing pages
For the best local results, create optimal landing pages. For example, if your brand serves a wide region, you might have a different landing page for each city in that region, like “dog grooming Simi Valley” and “dog grooming Thousand Oaks.”

16. Schema NAP+W
Schema markup is code you can add to your website to help search engines understand your various types of information. According to Searchmetrics, pages with schema markup rank an average of four positions higher in search results.

Local businesses need schema in particular to call out their name, address, phone and website URL, also known as NAP+W, as well as hours of operation and much more. As an example, here’s what schema for our NAP+W would look like in the page code:

Local business schema markup example (in Google’s preferred format, JSON-LD)
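Since the example markup appears above only as an image, here is a generic JSON-LD sketch with placeholder values (not our actual business data) showing how NAP+W fields map onto schema.org’s LocalBusiness type:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Dog Grooming",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Simi Valley",
    "addressRegion": "CA",
    "postalCode": "93065"
  },
  "telephone": "+1-805-555-0100",
  "url": "https://www.example.com",
  "openingHours": "Mo-Fr 09:00-17:00"
}
```

Place the JSON inside a `<script type="application/ld+json">` tag on the page, then validate it with Google’s testing tool.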

Google is planning to expand its use of schema, so be sure to take advantage of all the structured data that applies to your content. Check out Google’s Structured Data Testing Tool to confirm you’re implementing schema correctly.

17. Information in the Local Pack
Search engines want to make sure local business information is valid before presenting it in the “Local Pack” (the handful of local listings Google displays at the top of a web search results page, with addresses and a map). A business’s proximity to the searcher heavily influences whether it shows up in Local Pack results, so your location matters.

Keep your NAP+W data consistent across all sources. This is a local SEO priority, as it improves the search engines’ confidence in your business listing’s accuracy.

Be sure to include your business address on your own website. You can do this in the footer so it appears on every page, or at least show it on your contact page.

18. Google Map embedded
By adding a Google map to your contact page or footer, you can quickly show searchers and search engines exactly where you’re located. Using an embedded map rather than a static map image provides extra functionality and reduces friction — a human visitor can just click the map and grab directions. On our site, the embedded map shows in the footer when a user clicks [Location & Hour Information]:

Embed a Google Map to add an interactive element to your site.

19. Testimonials
To boost your brand’s credibility, you’ll need to get some local reviews or testimonials. Earn them (here’s a list of SEO-approved ways to get local reviews) and then add them, localized and with the author identified whenever possible. Testimonials, especially on a local level, can have a big impact. Seventy-three percent of consumers say that positive reviews make them trust a local business more.

20. Hawk update
In the past, Google filtered out local listings that shared the same address or that were on the same street (part of the Possum update), on the assumption that those listings belonged to one business trying to get multiple listings. But since the August 2017 Hawk update, Google appears to have narrowed that filter: listings that are merely on the same street are no longer filtered, though listings sharing the same address still can be.

On-Page Signals

21. Technical on-page SEO
On-page elements are critical to get right for organic SEO on any web page. In addition to the standard optimization items (see our always-up-to-date SEO checklist for a list), a locally targeted page should have:

  • City in the title tag
  • Schema markup (as appropriate to the page contents)
  • No keyword stuffing
  • No simple find-and-replace of city names
  • Appropriate reading level and complexity (compare top-ranking pages to find your sweet spot)

22. Local keyword optimization
Be sure to mention local keywords on your web pages (such as the name of your city, state or region and other geographical/local references) to help solidify Google’s understanding of your location and help you rank for local keyword queries.

Linking Signals

23. Local link building
You cannot rank in a city without having local links. When relevant, quality websites within your city link back to you, it shows you’re a trusted local brand. Only links coming from unique IPs, unique domains and unique WhoIs for your geographic area will help you rank, so don’t fall for link schemes. The anchor text (clickable text) used in the links also sends a signal to search engines. (See more link building guidelines.)

24. Local directories
To make it easier for searchers to find you, you’ll want to be included in geotargeted directories for services, such as Yellow Pages online, a local restaurant database, or others. These citations add more weight to your site in the local search ranking algorithms. (This interview with local expert Darren Shaw gives helpful information on local listings, including a directory list.)

25. Social and web mentions
Are people talking about your brand online? Even if they don’t include a link, brand mentions on social media platforms show engagement and interest in your business. These linkless mentions (and also “nofollow” links) help your business by attracting new customers and reinforcing your brand’s reputation, which can even influence local search rankings. Use a tool like GeoRanker to identify local citations and social media tools to keep tabs on the conversation.

26. External links
Boost your credibility by linking to local expert resources that would be useful to your site visitors. Choose external web pages that are relevant to your subject matter and region. Remember that in order to be viewed as a local expert, you should visibly network with other local experts.

27. Competitor backlinks
If someone is linking to your competition, they might link to you as well. Start by looking at the backlink profile of your top-ranked competitors (using a backlink analysis tool such as Majestic, Ahrefs or other). Identify good candidates — high-quality and relevant sites that don’t already link to yours. Then see if you can earn links from those same sites.

Local Pack Signals

28. NAP+W consistency
As mentioned earlier, NAP+W refers to your business name, address, phone number and website URL. The goal here is for your NAP+W to be consistent across the board — wherever it’s listed online. For local optimization, you don’t want to have various versions of your address and phone number out there, such as:

NAP inconsistencies identified should be fixed (via Yext)

To see if your NAP+W is consistent, try Yext’s free test.

29. Google My Business (GMB) optimization
Having a Google My Business listing is critical for businesses with service areas and physical businesses. It’s a free business listing to start building your visibility in Google Maps and Google Search.

In addition to ensuring NAP+W information is accurate, here are some optimization tips for your Google listing:

  • Add a unique description about your business. Make it long (400+ words), formatted correctly, and include links to your website.
  • Add your open business hours.
  • Select the best categories for your business (use Blumenthal’s Google Places for Business Category Tool).
  • Include a high-resolution profile cover image, plus as many additional photos as possible.
  • Use a local phone number (not a toll-free number).
  • Encourage reviews from your customers.
  • Use Google Posts to enhance your brand’s Knowledge Panel with upcoming events or special news. Your post displays only temporarily (usually for seven days), but will remain visible to anyone looking up your brand using Google mobile search, so make each post unique.

Also, create and optimize your business listing on Bing Places for Business.

30. Check your site on Google Maps
Your Google My Business listing and schema also help get your business to show up in Google Maps. Since navigation systems and customers may refer to Google Maps to find you, make sure the pin marks the correct location for your business. Here’s how to add or edit your site in Google Maps.

31. Local business listings
Increase your visibility by including your business on sites such as Yelp, Thomson Local, Angie’s List, Yellow Pages, TripAdvisor, Urbanspoon, OpenTable, Merchant Circle and Foursquare, as well as local travel and news sites — choose the sites that fit your type of business and customer base.

32. Better Business Bureau (BBB)
Boost your credibility by ensuring that your business is listed with the BBB. Monitor your ratings there and display your BBB rating on your website as a trust signal for visitors. As with all local directories, make sure your location information on BBB matches your NAP+W.

33. Citation building and reviews
Reviews will usually reflect absolute happiness or absolute misery. So it’s important to monitor the quantity and sentiment of your online reviews so you can actively manage your reputation.

  • Review sites to monitor include: Facebook, Google, Yelp, Bing, local chamber websites and more.
  • Sites where citations and mentions may occur include: Reddit, Quora, news media sites like WSJ, etc.
  • Consider adding a page to your website with instructions on how to provide reviews and feedback.

34. Location pages
It’s recommended that you have one or more pages on your site dedicated to each location your business is in. Dedicate a page to each keyword, for example, “real estate agent, Simi Valley” (services, then city). Design this to be a good landing page for anyone searching within that area, and make the content unique. Avoid laundry lists or simply doing a wild card replace for the city name. Search engines can spot that type of duplicate content a mile away. (See our tips for dealing with thin content on your site.)

35. Press releases
Press releases can be a great way to let locals know that you exist, especially if you have breaking news. Opening a new location? Hosting a charity event? Be sure to publicize it, and include the local geo references (city name, etc.) in your text. A press release published through an online PR site might catch the eye of a reporter who will publish a news article about your business in a local publication.

Social Signals

36. Social profiles
Being active in social media and sharing your content (think content marketing) contribute to keeping your business top-of-mind. On social media sites like Facebook, Twitter, Instagram, YouTube, Google+ and Pinterest, your profile pages matter — make them consistent with your brand voice and informative. Be sure to include your contact information. Engagement with your brand is a social signal, such as when something you’ve posted is shared or liked. It’s also a way to engage with current and potential customers.

37. Touch your followers
Help customers stay in the know. Social media can be an efficient way to spread news, local deals, alerts and updates to your customer base as well as get the word out to others. Interact with them one-on-one, and you may develop a brand advocate for life.

38. Become the local expert
Make yourself known as a trustworthy business by building local expertise and authority in your space. For example, you could teach a class or speak at a local event. Brainstorm presentations that bring value to an audience while showcasing your expert knowledge related to your business.

39. Local discounts
Attract local customers by offering discounts for locals. For example, you could offer members of a local organization $x or x% off your products or services, accept AAA discounts, and so on.

Success Signals for Local SEO

40. Online and offline conversion tracking/analytics
Stay on top of your conversions — actual results and dollars earned from your website — through analytics. (If you haven’t yet, set up Google Analytics for free.) Pay..

http://ift.tt/2BxLMOG

Tuesday, November 28, 2017

Knowledge Panels Up and Featured Snippets Down: What You Need to Know

Featured Snippets in Google have been on an upward trend during the past few years, with more websites aiming for “position 0”, the spot at the very top of the search engine results pages. Holding that spot helps a website attract more users, more traffic, and more leads, while giving searchers the best available answer to their query.

Despite the rising trend, the end of October and early November 2017 saw a drop: featured snippets disappeared from search terms that used to have them, which affects the traffic of many websites. Between October 27 and November 14 alone, the share of searches showing a featured snippet fell by 10%. That’s the kind of drop that has people in SEO concerned. Over the same period, there was an increase in knowledge panels, another development worth examining. Let’s get to the bottom of it.

An increase in knowledge panels

While featured snippets have declined, knowledge panels have seen a gradual increase. During the period in which featured snippets were dropping, the number of search terms with knowledge panels rose by 14%. Knowledge panels, like the one in the image below, provide reliable information about your search terms, such as contact details, Wikipedia links, and the like. In fact, some of the pages that lost featured snippets gained knowledge panels; some of these provide helpful information, while others offer only generic, basic facts.

Featured Snippets Drop Knowledge Panel

With the rise in knowledge panels, it seems Google has been experimenting: featuring knowledge panels more prominently in an attempt to improve the quality of the information users receive. So far, though, much of the knowledge panel content has been too simplistic for some search results, telling people what they already know instead of offering substantial detail. That’s something Google will need to address in the coming months, as information quality is always a crucial detail to iron out.

A long-term concern?

Given the steady rise of featured snippets over the past few years, this development is indeed concerning. While a change in strategy is one of the best ways to stay on top of the search pages, this short-term drop still needs to be looked into. It may only be temporary: Google might simply be tuning its search results and algorithms again, which happens often.

Despite some inconsistencies, Google’s featured snippets have been helpful to many websites, letting them gain organic reach and compete with other sites for “position 0”. With the increase in knowledge panels, some websites will have a harder time gaining that organic reach, which means information will come from fewer sources.

Key Takeaway

While this may be an issue for the time being, all we can do is wait it out and monitor daily to see whether it forces a change in SEO strategies and practices. It may also simply be a case of Google changing things up temporarily to see what works, in which case things may revert to how they were, and we’ll all be fine.

If you have any questions and inquiries about SEO, leave a comment below and let’s talk.

http://ift.tt/2nfA8Ft

Monday, November 27, 2017

Readability ranks!

Does it pay off to write a text that is nice and easy to read? Will readable content lead to higher rankings and more traffic? Is readability a ‘ranking factor’? At Yoast, we are strong believers in writing texts that are nice and easy to read. We’ve developed an entire tool to help people write readable texts. We truly believe that readability ranks. Writing readable texts will lead to higher rankings and more traffic.

Learn how to write engaging copy and how to organize it well on your site: Combine our SEO copywriting and Site structure training. »


So why is readability becoming more important? I’ll discuss two reasons why we believe readability is of increasing importance for SEO; in our view, it’ll become essential for every copywriter. The first reason is the growth of, and focus on, voice search. The second is that Google’s algorithm is getting better and more ‘human-like’.

Voice search becomes more dominant

Although few people use voice search yet, Google (and other search engines) are focused on voice. They present their results in a voice-like manner; they rank their results in a voice-like manner. The very fact that search engines want voice search to be the next big thing makes readability so important. To understand the importance of readability in voice search, look at this video:

People searching for information with voice search could end up listening to a rather long piece of information. In the example in the video above, the search result consists of an entire paragraph. Imagine such a paragraph consisting of long sentences and lots of difficult words: the voice result would be impossible to understand. Google will never rank a result like that, neither in the voice results nor in the normal results.

Google will prefer understandable content, because Google is focused on voice search. Google wants voice search to become the next big thing. Whether that will happen or not, doesn’t matter for the importance of understandable, readable content. Google simply dictates the search results and the algorithm. We just have to go with it. And in this case, it’s a good thing. Writing readable content is a blessing for the reader.

Learn how to write awesome and SEO friendly articles in our SEO Copywriting training »


Read more: ‘How to prepare for voice search’ »

Google’s algorithm becomes more human-like

Google has become much better at predicting what people want to read. The algorithm of Google is mimicking a human. It tries to read texts like a human being. As Google becomes more capable of understanding and scanning texts in a human-like way, the demands on the readability of texts also rise.

Google will, increasingly, assess the topic of a text the way humans assess the topic of a text. People scan through texts, read subheadings and the first sentences of paragraphs. People look for transition words in order to quickly abstract what the main conclusion of an article will be. All the things humans do while reading a text, are things Google will do. That means that the structure of your text, the way you write your paragraphs, will become increasingly important. Core sentences (the first sentence of every paragraph) will gain importance. Having a clear and logical structure in your text will be invaluable.

What will be the demands on readable content?

Understandable, readable content contains lots of transition words, as these words help people understand the connection between sentences. Content that’s easy to understand has short sentences and few difficult and long words. The structure of a text is very important. Core sentences will be the most important sentences in assessing the topic of an article or blog post. These will be demands copywriters will have to deal with. It might sound scary, but if you simply write a good text, you’ll please your readers and Google.
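
As a toy illustration of one such demand, here’s a sketch of a single readability signal: average sentence length. This is not Yoast SEO’s actual readability analysis, just an illustrative heuristic, and the function name is made up.

```javascript
// Toy readability heuristic: average words per sentence.
// Illustrative sketch only; not Yoast SEO's real analysis.
function averageSentenceLength(text) {
  // Split on sentence-ending punctuation; drop empty fragments.
  const sentences = text.split(/[.!?]+/).filter(s => s.trim().length > 0);
  const words = text.split(/\s+/).filter(Boolean).length;
  return sentences.length ? words / sentences.length : 0;
}

console.log(averageSentenceLength("Short sentences help. They are easy to read.")); // → 4
```

A real readability check combines many more signals (transition words, passive voice, word length), but each one boils down to simple text statistics like this.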

Keep reading: ‘The ultimate guide to content SEO’ »

http://ift.tt/2Ag4F8U

Sunday, November 26, 2017

Video backgrounds suck and should be banned from your website

“I just love those video backgrounds and we need them on our new website.” No, you don’t. “They are so engaging and set a friendly mood.” No, they don’t. “It’s an amazing new feature and it helps conversion.” No, it doesn’t. Besides that, the conversation is annoying me. Video backgrounds suck big time. Our good friend Karl Gilis of AGConsult said it perfectly: “Video backgrounds are the new sliders. They’re a distraction.” And just like sliders suck and should be banned from your website, so do video backgrounds.

Optimize your site for search & social media and keep it optimized with Yoast SEO Premium »

Why do you need that video background?

I dare to state that video backgrounds were invented by web agencies trying to convince customers of a particular design:

  • Hey, this will make you stand out!
  • Now this really sets a mood on your website, don’t you think?
  • Of course we can create that video for you at a mere x dollars extra
  • Video backgrounds will keep your visitors’ attention, so time on page goes up and that’s good for Google.

What!? You’re not maintaining that site for Google, but for your users. The second reason for video backgrounds is that we all have said at one point in time: well, that looks nice. We should have thought about it before saying that. Our customers have seen that video background as well and now they want it.

Video Backgrounds: you don't want one

This GIF file probably got your attention already, but background videos are worse, IMHO. I honestly can’t think of any additional benefit of that background video for your visitor.

Why video backgrounds suck

Think about it:

  • Video backgrounds increase loading time for a page (source: common sense).
  • They distract from the primary message/call-to-action on a page. Even if that button or whatever is bright orange and your design is monochrome, a video is distracting!
  • Video backgrounds usually use videos not hosted by you. But if it doesn’t load, you’ll get the blame anyway and your design will look like crap.
  • ‘Cover the entire screen with video and set white text on top of it as the homepage’ probably is a trend. Let’s all put an end to this trend.
  • Autoplay sucks, and how on earth would it be logical to have a start button for a background video!? A call-to-action for your background? C’mon!
  • Video backgrounds might work on very specific landing pages in very specific niches; most of the time you’ll just hurt conversion #alwaysbetesting
  • The things that go for informative product videos do not apply to video backgrounds. It’s a different thing! Really! Sigh.
  • Why do you think successful sites on the web don’t use video backgrounds? Right!

Video backgrounds suck and should be banned from your website. Even with all the nice looking examples in this article: Dos and Don’ts for Using Background Videos on Your Website, I still think the don’ts outweigh the dos. And that post is two years old. How come we haven’t put an end to that trend yet?
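
If, despite all of the above, you’re stuck shipping a background video, at least gate it. A defensive sketch (the function name is hypothetical; in a browser you’d feed it `window.innerWidth` and the result of the `prefers-reduced-motion` media query):

```javascript
// Hedged sketch: decide whether a background video should load at all.
// In a real page: shouldLoadBackgroundVideo(window.innerWidth,
//   window.matchMedia("(prefers-reduced-motion: reduce)").matches)
function shouldLoadBackgroundVideo(viewportWidth, prefersReducedMotion) {
  const MIN_WIDTH = 1024; // assumption: treat narrower viewports as mobile
  return viewportWidth >= MIN_WIDTH && !prefersReducedMotion;
}

console.log(shouldLoadBackgroundVideo(1280, false)); // large desktop, no preference
console.log(shouldLoadBackgroundVideo(375, false));  // phone: skip the video
```

Skipping the video for small screens and motion-sensitive users saves bandwidth and respects accessibility preferences, though the honest answer remains: don’t ship one at all.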

And what about mobile data plans and such?

Yes! Now that we have established that you just don’t need that video background for any reason, think about how much data you will have left at the end of this month! You’ll be able to watch another episode of The Ranch while commuting.

No, seriously. On your mobile website, that video background simply makes even less sense. We talked about mobile UX in this post, and video backgrounds don’t fit in with recommendations like ‘tone it down’ and ‘optimize for speed’. It’s a bad idea. Period.

The world doesn’t need video backgrounds

We have to start somewhere to eliminate the evil that is video backgrounds. Heal the world, start at your own home. Convince your customer that there is little upside to video backgrounds. Show them a dozen websites that either support your claims or demonstrate how a video background fails its visitors. And please, please stop recommending them. On behalf of the entire internet, I thank you.

Read more: ‘Sliders suck and should be banned from your website’ »

http://ift.tt/2BeImj7

Saturday, November 25, 2017

Search and SEO in 2018

2018 is coming soon and people are starting to ask: what’s new? What should we do to keep up with changes in search and specifically in SEO in 2018? In this post, I’ll sum up the biggest changes in our world, and what you should be working on.

The search landscape is changing

Over the last decade(s), our computers have become faster and faster, and our phones have been catching up. The iPhone X is faster than many computers people have at home. The power of the small machines we have in our hands is slowly being utilized by apps and search engines alike.

Optimize your site for search & social media and keep it optimized with Yoast SEO Premium »


Building on that growing power of the devices in our hands, the reliability of voice recognition has been steadily increasing. It’s now at the point where, in languages like English, voice commands can be reliably used to instruct our devices to do something. One of those things is search.

Voice search changes everything

We cannot tell you how many people search with voice. Most people, for now, will not use voice search as their primary mode of searching. But: the search engines are optimizing for voice search results and have been doing that for a while now. Because the search engines are optimizing for voice results, all of search has already changed because of voice search.

The featured snippets that SEOs have been striving to get are a prime example of how voice search has changed SEO. Optimizing for these snippets requires old school SEO tactics combined with something new. You see, a featured snippet is meant to be read out loud. That’s the context in which Google’s Gary Illyes told people to read their copy out loud, early this year.

Listen to this result from Google Home for the search [what is a meta description?]:

http://ift.tt/2i4ui87

If you’ve listened to the answer above, you’ll know why readability is so important. Answers this long become very hard to listen to if they’re not well written. And even then, we still have to figure out things like how to get Google to pronounce SEO as S-E-O instead of “Seeoo”.

Google changes

Besides voice search and Google’s focus on that, more is changing in and for Google. Specifically: a few new technologies and a profound new way of looking at the web.

Mobile first indexing

We’ve written about mobile first indexing before, but the basic idea is simple: Google is changing how it looks at your site. From ‘judging’ your site as though it’s a desktop site, it’ll switch to judging your site as a mobile site. Every bit of content that can’t be reached on your mobile site, will not count for your ranking.

It’s still unclear when this will roll out and how fast this will roll out. Google says they’re already testing it, but they also say that sites that aren’t ready for it shouldn’t be hurt, for now. Regardless of that, your site should be working well and fast on mobile, so if it isn’t, that’s going to be your priority for SEO in 2018.

AMP

If you haven’t heard about AMP, you’ve missed quite a few posts on this site. I’d suggest you start reading here to learn what AMP is and why it’s important.

Google is focusing a lot of time and effort on AMP. So much that one of the projects we’ve planned at Yoast for 2018 is to see whether we can recreate our single post pages entirely in AMP, dropping the non-AMP version completely. Yes, that’s how important we think AMP will become in the long run. I don’t expect normal sites will have to do anything that drastic in 2018, but do make sure you keep up to date with the latest news on AMP.

Want rich snippets for your site? Try our Structured data training »

Structured data: JSON+LD & Schema.org

Alongside AMP, Google is pushing more and more structured data ‘profiles’. By asking webmasters and SEOs to mark up their content in structured data, according to schema.org data structures, Google is trying to understand the web better.

Yoast SEO already does a good chunk of the work of adding structured data to your site. For most small business websites and blogs, what it does should be enough.

But if you have a site that has a lot of content that fits one of the schema.org data types (think of recipes, reviews, products, etc.), I’d highly suggest following our Structured Data course. After that you’ll know how to set up a properly structured data strategy for your site.
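
To make the idea concrete, here’s a minimal sketch of what a schema.org Article marked up as JSON-LD looks like. The field values are placeholders, and Yoast SEO generates its own markup for you; this is just to show the shape of the data Google is asking for.

```javascript
// Minimal JSON-LD sketch for a schema.org Article (placeholder values).
const articleMarkup = {
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Search and SEO in 2018",
  "datePublished": "2017-11-25",
  "author": {
    "@type": "Person",
    "name": "Example Author"
  }
};

// On a page, this JSON would be embedded inside:
// <script type="application/ld+json"> ... </script>
const jsonLd = JSON.stringify(articleMarkup, null, 2);
console.log(jsonLd);
```

Other schema.org types (Recipe, Review, Product) follow the same pattern: a `@context`, a `@type`, and properties defined for that type.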

Content is still king

While all of the technical changes above are important to SEO in 2018, and you should definitely keep an eye on them, content is still the thing that’s going to make you rank. Our recent ultimate guide to content SEO should get you started on the right path there. Good luck optimizing your site in 2018!

Read more: ‘Structured data with schema.org: the ultimate guide’ »

http://ift.tt/2zlirFL

Friday, November 24, 2017

Ask Yoast: Yoast SEO premium features explained

As most of you will agree, the free version of Yoast SEO is already an awesome plugin. So, we understand that many of you frugal site owners and bloggers may be a bit reluctant to ‘splurge’ on Yoast SEO Premium. What more could the premium version have to offer? Well, as a matter of fact, it has a great deal more to offer! It has several features that help you drastically improve your site structure, make it easier to avoid 404s, and give more insight into your content. In this Ask Yoast, I discuss Yoast SEO premium features that are especially interesting for bloggers.

Alexa emailed us this question:

My audience are travel bloggers and I think they all use the free version of Yoast SEO. I don’t think they would pay for Premium unless they really understood the value. Can you share the main differences of the free vs. the paid version?

Watch the video or read the transcript further down the page for my answer!

Optimize your site for search & social media and keep it optimized with Yoast SEO Premium »

Yoast SEO premium features explained

Well, there’s nothing I like more than an opportunity to tell you why you should give me money. So let me do that. There’s a couple of things in Yoast SEO Premium that I think are awesome for regular bloggers. One of those things is the redirect manager: if you change a URL somewhere you can easily redirect it; if you delete a post we will give you options to do something with that.
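
Under the hood, a redirect like the ones the redirect manager creates is just an HTTP response: a 301 status plus a Location header. A conceptual sketch (this is not how Yoast SEO implements it; the plugin runs inside WordPress/PHP, and the names here are made up):

```javascript
// Conceptual sketch of what a redirect manager stores and does:
// a map from old URLs to new ones, answered with a 301 response.
const redirects = {
  "/old-post": "/new-post",
  "/deleted-page": "/" // e.g. a deleted post pointed at the homepage
};

function resolveRequest(path) {
  const target = redirects[path];
  return target
    ? { status: 301, headers: { Location: target } } // permanent redirect
    : { status: 200 };                               // serve the page normally
}

console.log(resolveRequest("/old-post")); // → 301 pointing at /new-post
console.log(resolveRequest("/about"));    // → 200, serve normally
```

The point of the redirect manager is that you never have to write this plumbing by hand: change or delete a URL in WordPress and the mapping is created for you.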

Even more important for bloggers is the internal linking feature that we have. We give you options for posts that you could link to from your current post. Based on what you’re writing about, we’ll tell you, “Hey, this looks similar to that post, you should link to that post.” We’ll give you that option. This will hugely increase how many internal links you have in your site. And because of having more internal links, people will stay on your site longer, your site will rank better, there’s lots and lots of benefits.

Yoast SEO Premium comes with a few more options; I’d encourage you to check out the Yoast SEO Premium page and, well, go buy Yoast SEO Premium, it’s not that expensive. Good luck.

Ask Yoast

In the series Ask Yoast we answer SEO questions from our readers. Have an SEO-related question? Let us help you out! Send an email to ask@yoast.com.
(note: please check our blog and knowledge base first, the answer to your question may already be out there! For urgent questions, for example about our plugin not working properly, we’d like to refer you to our support page.)

Read more: ‘Why you should buy Yoast SEO premium’ »

http://ift.tt/2iNH6MW