Category: Industry Buzz

Looking Ahead to Drupal 8: What to Expect


There is much to be excited about in the 2014 release of Drupal 8. Upgrades included in this version of the open-source content management system will benefit just about everyone, including marketers, content creators, developers and ultimately website users.

What’s new?

This version, which will come more than three years after the release of Drupal 7, is a step forward in modernizing and maturing the software. Drupal 8 has more than 1,600 contributors – nearly double the number that contributed to the previous release. Drupal 8 includes roughly 200 new features that improve accessibility, the content authoring process, search engine optimization (SEO), and asset management. The new version will also bring Drupal up to current standards for delivering web content.

With so much change, I’m often asked: What is Drupal 8’s killer feature? What will Drupal 8 do that we can’t live without? The answer depends on how you use Drupal.

Read More

7 Things I Learned at SMX East 2013

Last week I attended my first search conference, SMX East, in New York City. The conference is composed of diverse sessions covering the latest search engine optimization (SEO), search engine marketing (SEM) and social media marketing tactics. Not only was it great to hear from some of the top players in the search industry, but it was also an opportunity to network and share tips with search marketers from across the country and around the world. Here are my top seven takeaways from the conference.

Acquiring natural links and staying Penguin-proof

In my first session, industry experts talked about leveraging niche audiences for natural links by becoming part of their tribe. If we take the time to understand our audience, we can evoke their interest through the creation of tailored content. By becoming part of their community and providing them with useful content, we can obtain natural links for free.

Diversify your digital marketing

Content can be used in many ways. For example, a single whitepaper can be repurposed into podcasts, presentations, press releases, webinars, blog posts, email newsletters, interviews, weekly roundups and more.

Infographics are a great source of social signals, traffic and links from other blogs. To leverage another social platform, slice an infographic up for SlideShare, using either the same content repurposed or slightly altered companion content. SlideShare allows a direct connection to Twitter on every slide, enabling users to tweet content from a slide straight to their followers without ever leaving your content.

The most important search ranking factors

Mathew Peters from Moz shared a study that polled SEOs about what they thought would become the most important search ranking factors. According to the poll, SEOs believe several non-traditional search elements will heavily influence rankings, including quality, authorship, structured data and social signals.

“Twitter is the second screen to TV”

Richard Alfonsi, Vice President of Online Sales at Twitter, opened Day 2 at SMX East with a keynote conversation. Today, 95% of public conversations about television happen on Twitter. As a second screen, Twitter gives TV viewers a shared experience; it makes watching TV richer and energizes people. Twitter has seen that conversation around TV shows can drive viewership, ultimately increasing ratings.

Semantic Data & Semantic SEO

Semantic SEO provides open, structured data that increases search visibility by connecting users with the information they search for, enabling search engines to organize the web more effectively. Beyond keywords alone, unique identifiers help search engines determine the meaning behind a particular query, returning more relevant information and attracting more qualified traffic.
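To make this concrete, here is a minimal sketch of what structured data looks like in practice: a schema.org description of an organization, serialized as JSON-LD for embedding in a page. The organization name, URL and social profiles below are placeholders, not a recommendation for any specific markup.

```python
import json

# A schema.org "Organization" entity. Search engines read this block to
# attach unambiguous meaning to a page instead of inferring it from
# keywords alone. All names and URLs here are hypothetical.
org = {
    "@context": "http://schema.org",
    "@type": "Organization",
    "name": "Example Digital Agency",
    "url": "http://www.example.com",
    "sameAs": [
        "http://twitter.com/example",
        "http://www.facebook.com/example",
    ],
}

# JSON-LD is embedded in the page inside a script tag.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(org, indent=2)
print(snippet)
```

The `sameAs` links are what tie the entity to its verified social profiles, which is the same idea behind connecting authorship markup to a Google+ profile.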

Using authorship in SEO can prevent spam, highlight quality authors and help rank quality content. It is speculated that higher visibility through the authorship tag will lead to a higher click-through rate. But it is not enough to just claim authorship; we need to be active on Google+, spend time building circles, generate more engagement and notify fans on other social networks.

Some panelists cautioned conference-goers that by using semantic SEO you are providing information for Google and Bing to use in the Knowledge Graph and snapshot results, potentially diverting traffic from your site. Currently, there is no reliable way to link clicks back to enhanced search results.

Content: The good, the bad, and the ugly

Not sure what to write about? Consider doing competitor research. By checking out what type of content sits on your competitors' top pages with inbound links, search marketers can learn what kind of content a particular audience is looking for. Also, look at what is trending on social media platforms like Reddit. Google News can also be a good source to find out what is hot right now.

Hummingbird

It wasn’t until Day 3, during a session on the future of SEO with a panel including Danny Sullivan, Brian White (Google) and Duane Forrester (Bing), that the mysterious Hummingbird algorithm was discussed. So far we know that Hummingbird is a full rewrite of the Google search algorithm, the first time Google has done this in years. White explained that there is a stronger focus on the meaning behind keywords and on how those meanings connect with each other. Through a greater balance of signals, Hummingbird will allow Google to better understand what users want from their queries. White left a lot of questions unanswered. We will have to keep an eye on the impact of Hummingbird!

Did you attend SMX East? I would love to hear your main takeaways! Comment below and follow me on Twitter @RachelRacoosin.

Why Online Reputation Management is Never Complete

When we start a new program for a RepEquity Online Reputation Management client, we follow a standard checklist for getting their program up and running. Once we’ve collected all relevant information from the client, we begin by claiming or building out any existing social media profiles. We then build the client a complete bio website that highlights their involvement in business, philanthropy and other arenas. These two steps are the foundation of nearly every ORM program we run and become our leveraging points for cleaning up the search engine results for individuals, corporations and nonprofits.

An online reputation issue is not something that can be quickly fixed and then ignored. It is a constant battle to keep positive results at the top of the search engine results, and as an ORM company we remain vigilant for new rankings and strategies that could help our clients.

Updates to Google’s Algorithm

This year, we saw two major algorithm updates by Google that shook up organic search engine results.

Originally released in 2011, the Panda algorithm was updated to lower the rankings and visibility of low-quality sites and push higher-quality sites up in the search results. This benefitted our clients by increasing rankings for their social media profiles, but in some cases it also increased visibility of potentially harmful, untrue, or out-of-date content on news websites.

Google updated its Penguin algorithm on April 24, 2012.  Similar to Panda, Penguin was intended to decrease search engine rankings of websites found to be in violation of Google’s Webmaster Guidelines by using black-hat SEO techniques like keyword stuffing, cloaking, and deliberate duplicate content. Penguin affected an estimated 3.1% of search queries in English—hugely impactful for an algorithm update. Penguin prompted us to take a closer look at some client websites that were in place before signing with RepEquity. We helped clients eliminate duplicate content and analyze their backlinks for potential red flags.

Read More

A Perfect Ten? Try a Perfect Seven

Over the past few months we’ve noticed that about 10% of the time, fewer than 10 results display on Google’s first search engine results page (SERP).  Why are fewer domains making the all-important first page of Google results?  Yesterday Google answered that question:

We’re continuing to work out the best ways to show multiple results from a single site when it’s clear users are interested in that site. Separately, we’re also experimenting with varying the number of results per page, as we do periodically. Overall our goal is to provide the most relevant results for a given query as quickly as possible, whether it’s a wide variety of sources or navigation deep into a particular source. There’s always room for improvement, so we’re going to keep working on getting the mix right.

In a recent article on SEOmoz, Peter J. Meyers, president of User Effect, points out that historically 1% to 4% of SERPs have displayed fewer than 10 results. But that percentage jumped to over 18% early last week. While some of these SERPs displayed eight or nine results, most displayed just seven.

Meyers notes that the SERPs impacted are mostly for branded keyword searches (i.e., searches for company names).  The SERPs for branded keywords tend to lack diversity in domains; that is, many of the top 10 results are from the same domain.  He speculates that Google eliminated three search results to increase domain diversity, or the variety of domains that rank on the first page.  He also points out that many of the SERPs displaying fewer than 10 links include sitelinks displayed within the top ranking.

What this Means for You

On the plus side, we already work with our clients to ensure domain diversity by maintaining active accounts on well-known social media sites and distributing content across multiple domains.  For example, the Google search results for “RepEquity” (pictured below) include the RepEquity Blog and RepEquity Labs sites in addition to our main site, www.RepEquity.com.

RepEquity Search Results

This change could also mean a huge win for clients fighting false reports on sites like Ripoff Report, Scam.com and others.  According to a Search Engine Watch report, Google is looking hard at verified data when deciding what content ranks—or not. “Verified data in this case seems to be any source that has to go through a fairly rigorous verification process,” writes associate editor Danny Goodwin.  Sites with unverifiable sources of data, like Ripoff Report, may be penalized and have a harder time ranking on page one.

It’s not unusual for Google to favor content on certain domains that it deems to be highly relevant and credible.  On the SERPs that include fewer than 10 results, we are seeing evidence that Google may favor well-known domains like wsj.com or usatoday.com, even if the content (in this case news) is outdated.  This could negatively impact clients who may be haunted by negative press from long-ago indiscretions.  If this is a problem for your brand, please contact us.

Recommendations

As Google continues to work on the SERPs, these tips will help you weather the storm, even if Google returns to displaying 10 results on each page.

  • Use Google Webmaster Tools to implement sitelinks for your branded pages.
  • Claim your company name on social media sites like Twitter, Facebook and LinkedIn.
  • Ensure all of your web sites and microsites are content-rich and optimized for search.

If you have questions about the recent changes in Google’s SERPs or need help managing your brand’s online reputation, contact us.


Quick Response, Slow to Catch On

RepEquity QR Code

Quick response (QR) codes add interactivity to offline placements including magazine ads, posters, billboards, price tags, and even t-shirts and beer cans.  QR codes are at a crossroads in the U.S.: Will they turn out to be a fad or will they eventually become as common as Facebook and pay-per-click (PPC) advertising? We don’t expect to know the answer to that question in 2012 and will continue to monitor how brands use QR codes to deliver interactive mobile experiences.
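One concrete way to monitor a QR placement is to encode a tagged landing URL rather than the bare page address, so scans show up separately from other traffic in web analytics. A minimal sketch, assuming standard UTM campaign parameters; the domain and campaign names below are hypothetical:

```python
from urllib.parse import urlencode

def qr_landing_url(base_url, campaign, placement):
    """Append campaign-tracking parameters to the URL a QR code points
    at, so scans can be told apart from other traffic in analytics."""
    params = urlencode({
        "utm_source": "qr",
        "utm_medium": "offline",
        "utm_campaign": campaign,
        "utm_content": placement,
    })
    separator = "&" if "?" in base_url else "?"
    return base_url + separator + params

# One code per placement lets you compare, say, magazine vs. poster scans.
url = qr_landing_url("http://www.example.com/promo", "spring-launch", "magazine-ad")
print(url)
```

Generating the actual QR image from this URL is then a job for any standard QR library or online generator.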

While QR codes are certainly the buzz in our industry, today they are not widely understood or adopted in the U.S. In June 2011, roughly 14 million U.S. mobile users scanned a QR code with their phone, according to data from comScore MobiLens. That number, while large, represents only 6.2 percent of the country’s total mobile audience. In a study by Lab42, 58 percent of respondents answered “no” to the question “Are you familiar with QR codes?”  Of those, 43 percent said they did not know what a QR code is.

Read More
