A FEW DASHBOARD UPDATES YOU MIGHT BE INTERESTED IN:
We have been busy improving our platform and updating our technology, and we have some good news to share. Here are a few important updates:
- Late last year, we noticed that search engines like Bing were getting more aggressive about blocking bots, which affected our keyword ranking (SERP tracking) module. To get the most accurate results possible, we redesigned our technology to replicate actual human visitor behavior. It’s with great pleasure and excitement that we announce this new SERP tracking technology for Bing keyword tracking. We plan to gradually extend it to Google rankings, audits, and other key areas to future-proof our results.
- In addition to regular search results, we will prioritize the tracking of other important sections in Bing SERP, including “People Also Ask”, Video, and Image search results, and show them on our dashboard.
- Apart from the Local Map pack in Google SERPs, one of the features agencies request most is local rank tracking. We are moving one step further by detecting your client’s website presence in Google Maps results.
- Keyword opportunities are always a great KPI to show clients. We will be introducing a new section showing “suggested search keywords” for each keyword we track. These are additional keywords you can target and rank for, increasing traffic to the website.
- Tracking multiple business names for a single business is difficult, and until now our system could track rankings for only one name per business. We are improving it to track multiple names for the same business, letting us surface more of the keywords a business ranks for.
- Scan your client’s website health for 100+ harmful issues with our much-improved, high-speed SEO crawler. Our advanced crawler will identify issues on your website that might be limiting its rankings, helping make your client’s website more search-friendly and, in turn, improving rankings.
- Crawler audit data will also be available on a “per page” basis, giving accurate information about a particular page in addition to the whole website. This lets us resolve SEO errors at a granular, page-by-page level.
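The actual rules our crawler applies aren’t spelled out above, but the idea of a per-page audit can be sketched in a few lines. The checks below (missing title, missing H1, missing meta description) are illustrative assumptions, not the real rule set:

```python
# A minimal sketch of a per-page SEO audit using only the standard library.
# The three checks here are common examples, chosen for illustration only.
from html.parser import HTMLParser

class PageAuditor(HTMLParser):
    """Collects a few basic on-page SEO signals while parsing HTML."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_h1 = False
        self.has_meta_description = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.has_title = True
        elif tag == "h1":
            self.has_h1 = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.has_meta_description = True

    def issues(self):
        found = []
        if not self.has_title:
            found.append("missing <title>")
        if not self.has_h1:
            found.append("missing <h1>")
        if not self.has_meta_description:
            found.append("missing meta description")
        return found

def audit_page(html):
    """Return a list of SEO issues detected on a single page."""
    auditor = PageAuditor()
    auditor.feed(html)
    return auditor.issues()

sample = "<html><head><title>Home</title></head><body><p>Hello</p></body></html>"
print(audit_page(sample))  # this sample page lacks an <h1> and a meta description
```

A production crawler would run a check like this across every URL it discovers and aggregate the results both per page and site-wide.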
- Our website monitoring system is getting an overhaul with more accurate information. Our improved, reliable website uptime/downtime monitoring system will instantly alert you via email if your website goes down.
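The core of an uptime monitor like the one described above is just a periodic reachability probe plus an alert path. Here is a minimal sketch, assuming a simple HTTP HEAD probe and SMTP delivery; the sender address and SMTP host are placeholders, not our actual infrastructure:

```python
# A minimal sketch of an uptime check with an email alert.
# "monitor@example.com" and the SMTP host are placeholder assumptions.
import smtplib
import urllib.request
from email.message import EmailMessage

def site_is_up(url, timeout=10):
    """Return True if the site answers with an HTTP status below 400."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except Exception:
        return False

def alert_if_down(url, recipient, checker=site_is_up, mailer=None):
    """Probe the site; send an alert email if it appears to be down."""
    if checker(url):
        return "up"
    msg = EmailMessage()
    msg["Subject"] = f"ALERT: {url} appears to be down"
    msg["From"] = "monitor@example.com"      # placeholder sender
    msg["To"] = recipient
    msg.set_content(f"The monitor could not reach {url}.")
    if mailer is None:
        mailer = smtplib.SMTP("localhost")   # placeholder SMTP host
    mailer.send_message(msg)
    return "down"
```

In practice this would run on a schedule (e.g. every minute) from more than one location, so a single failed probe or a regional network blip doesn’t trigger a false alarm.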
If you have any requests for new features or improvements in existing modules, please feel free to reach out to us.
GOOGLE NOW ALLOWS PRODUCT PLACEMENT AT THE TOP OF LOCAL & MAPS LISTING
Google My Business, which has been rechristened as Google Business Profile, now has a new feature that will allow businesses to mark a product as “Special”. This feature, which was recently rolled out by Google, appears in the Products section and takes the product marked as “special” right to the top of the products in a business’s Google Business Profile listing.
Here’s what the Products section looks like now:
The Google Business Profile Products Section Settings displays the option to “Mark as Special.” It clearly states that “Products marked Special are shown at the top of the page.”
What Makes This an Exciting Development
While not all businesses may see the need for this feature, a section of local SEOs and businesses have been looking for a way to promote certain products at the top of the page. This feature gives them the ability to put the best of their products high up in their Google local search and Maps listings.
WHY DO WEB PAGES DROP OUT OF SEARCH CONSOLE REPORTS?
A user raised a query regarding Search Console reports: some of their web pages show up and then drop out of the reports. In particular, in the Core Web Vitals report, they are seeing a steady month-to-month decline in the number of pages reported.
Google’s John Mueller recently addressed this issue in an office hours segment. He said, “it’s a matter of how Search Console reports on web pages. Rather than attempt to include every page in the report, they use a sample of the URLs from your site. The number of pages they use to sample your site can vary from month to month in exactly the way that you’re reporting.”
He further added, “it’s something where having fewer URLs in these reports doesn’t mean that the other URLs are bad or problematic. It’s just we didn’t check them.
So especially for the aggregate reports, which is for the Core Web Vitals to some extent, the AMP report, the Structured Data report, mobile friendliness … for those reports, we only take a look at a sample … and that sample can change over time.”
He explains, “they might look at 200 URLs in one month and then in the next, they’ll look at maybe 100 URLs or something like that. It doesn’t mean that anything bad is happening; it’s just that Google is looking at a smaller sample of pages from your site.”
He advises that, instead of stressing out about the changing sample size, you should focus on what is actually being reported – specifically the relationship between the bad pages that are reported and the good ones.
If you see that all of the URLs in the report are without errors, you’re good to go. But, if you’re seeing errors, especially if the proportion of errors is increasing over time, you’ve probably got some fixing to do.
In any case, the total number of URLs in the report should not be your primary concern. It’s the relationship between the pages and the reported negatives you should focus on and attempt to fix.
You can watch the video below, where John discusses the issue, starting at the 51:33 mark: