Google’s John Mueller & Martin Splitt Answer Questions About Site Speed (updated November 2023, Tai-facebook.edu.vn)
Google’s John Mueller and Martin Splitt teamed up for a special edition of the ‘Ask Google Webmasters’ video series to answer questions about site speed.
They took turns answering a series of rather technical questions from SEOs – even some that appeared to go over Mueller’s head.
Here’s a quick recap of each question and answer.

Question 1: Ideal Page Speed
“What is the ideal page speed of any content for better ranking on SERP?”
When Google measures page speed for the purpose of ranking content it generally separates pages into two categories: really good or pretty bad. Splitt says there is not much of a threshold between those two extremes as far as Google’s algorithm is concerned.
Presumably that means there is no “ideal” speed. Just make sure pages are fast enough to fit into the “really good” category. Again, no specific measurements were given, but it helps to look at it from a visitor’s perspective and think about whether they would find your site fast or slow.

Question 2: Importance of Google PageSpeed Insights
“I wonder, if a website’s mobile speed using the Test My Site tool is good and GTmetrix report scores are high, how important are high Google PageSpeed Insights score for SEO?”
Mueller recommends using each of these different tools and looking at the data to discover low hanging fruit on your web pages. That refers to anything that could be easily improved to give pages a boost in speed. So look for the quick wins, in other words.
Splitt answers the question from the perspective of someone giving the report to a third-party – such as an SEO giving the report to a developer to make improvements.
Each of the tools in question measures slightly different things and presents the results in different ways. So Splitt suggests being mindful of that and using the tool that’s best for your audience.

Question 3: Chrome DevTools
“I am testing an almost empty page on #devtools Audits (v5.1.0) it usually gives minimum results which 0.8ms for everything and 20ms for FID (First Input Delay) but sometimes it gives worse results in TTI (Time to Interactive), FCI (First CPU Idle) and FID. Same page, same code. Why?”
Splitt says these measurements aren’t always exact and there will always be some fluctuation between them. Even a fluctuation between 0.8ms and 20ms is not considered unusual, and Splitt notes that it takes roughly 20ms for a single frame to draw.
With that in mind, try not to get too hung up on numbers unless you see something wildly out of the ordinary. If you start seeing measurements ranging from 20 seconds to 1 minute, for example, then it’s time to be concerned.

Question 4: Best Metrics to Analyze
“What is the best metric(s) to look at when deciding if page speed is “good” or not? Why / why not should we focus on metrics like FCP (First Contentful Paint) / FMP (First Meaningful Paint) instead of scores given by tools like PageSpeed Insight?”
Short answer – “it depends.”
The importance of these metrics varies according to what users are doing on your site. If users are just going to your site to read content, and not really interacting with anything, then FMP and FCP would be the more important metrics to look at.
On the other hand, if it’s an interactive web application where people are landing on the page and immediately interacting with things, then metrics like TTI & FCI would be the best ones to look at.
Splitt and Mueller both agree that page speed is more complicated than what can be conveyed by any single metric. They recommend looking at multiple sets of data and finding areas of improvement according to what’s most important for your audience.
See the full video below:
Google’s John Mueller recently explained how query relevancy is determined for pages blocked by robots.txt.
It has been stated that Google will still index pages that are blocked by robots.txt. But how does Google know what types of queries to rank these pages for?
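For context, a page is “blocked by robots.txt” when a rule like the following prevents crawling, even though the URL itself can still end up in the index via links from other pages. This is an illustrative example; the path is hypothetical:

```text
# Hypothetical robots.txt at the site root.
# Googlebot may still index /private-report/ if other pages link to it,
# but it cannot crawl and read the page's content.
User-agent: *
Disallow: /private-report/
```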
That’s the question that came up in yesterday’s Google Webmaster Central hangout:
“Nowadays everyone talks about user intent. If a page is blocked by robots.txt and is ranking, how does Google determine the query relevancy with page content as it’s blocked?”
In response, Mueller says Google obviously cannot look at the content if it’s blocked.
So what Google does is find other ways to compare the URL with other URLs, which is admittedly much harder when blocked by robots.txt.
In most cases, Google will prioritize the indexing of other pages of a site that are more accessible and not blocked from crawling.
Sometimes pages blocked by robots.txt will rank in search results if Google considers them worthwhile. That’s determined by the links pointing to the page.
So how does Google figure out how to rank blocked pages? The answer comes down to links.
Ultimately, it wouldn’t be wise to block content with robots.txt and hope Google knows what to do with it.
But if you happen to have content that is blocked by robots.txt, Google will do its best to figure out how to rank it.
You can hear the full answer below, starting at the 21:49 mark:
“If it’s blocked by robots.txt, then obviously we can’t look at the content. So we do have to kind of improvise and find ways to compare that URL with other URLs that are kind of trying to rank for these queries, and that is a lot harder.
Because it’s a lot harder it’s also something where, if you have really good content that is available for crawling and indexing, then usually that’s something we would try to kind of use instead of a random robotted page.
So, from that point of view, it’s not that trivial. We do sometimes show robotted pages in the search results just because we’ve seen that they work really well. When people link to them, for example, we can estimate that this is probably something worthwhile, all of these things.
So it’s something where, as a site owner, I wouldn’t recommend using robots.txt to block your content and hope that it works out well. But if your content does happen to be blocked by robots.txt, we will still try to show it somehow in the search results.”
On a Google Webmaster Hangout, Google’s John Mueller was asked if the author bio page was necessary in order to meet Google’s E-A-T guidelines. Mueller’s response downplayed the necessity of author bio pages as a technical issue and suggested it was a user experience issue.

Authorship Signals
The SEO industry believes that Google’s Quality Raters Guidelines describe how to rank better in Google. It is from those guidelines that the belief arose that naming the author and listing their biography and credentials is a technical requirement to check off the SEO ranking-signal list.
But Google has never said that authorship biographies were a ranking signal. And the Quality Raters Guidelines were never represented by Google as listing ranking related signals.
John Mueller’s answer does not recommend authorship as a ranking signal. Instead, he frames it as a user experience issue.
Expertise, authority and trustworthiness are important. But they are not the entire algorithm.

The Necessity of Author Biography Pages
The webmaster was concerned that their author biography pages were not being seen by Googlebot because they were noindexed. Noindex is a robots meta tag (or HTTP header directive) that tells search engines to exclude a web page from the search engine index.
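For reference, the noindex directive described above is typically expressed as a meta tag in the page’s head (illustrative snippet, not the webmaster’s actual markup):

```html
<!-- Placed in the <head> of the author bio page: asks search engines
     not to include this page in their index. -->
<meta name="robots" content="noindex">
```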
The concern was that, because Google could not see the author biography pages, this would have a negative effect on rankings. But according to Google’s John Mueller, that’s not the case at all.
Here is the question:
“Can you speak to the necessity of E-A-T and author biography pages linked from an article?
….So, kind of the necessity of the author biography pages. Should we have the author’s credentials on the article itself or is linking to their biography by their byline good enough?
We have an issue where the author bio pages are meta noindex. Does it stop GoogleBot or Google Quality Raters from accessing the pages?”
John Mueller began by trying to define what the acronym E-A-T meant.
He raised his head and stared upward for a few seconds trying to recall what it meant.
John Mueller then went on to incorrectly recall what the acronym E-A-T stood for. He repeatedly referred to the “A” as Authority. Google’s Quality Raters Guidelines consistently refer to E-A-T as Expertise, Authoritativeness and Trustworthiness. It’s authoritativeness, not authority.
Here is John Mueller’s response:
So E-A-T is expertise, authority, trustworthiness, I think…
And it comes from our quality raters guidelines, which are basically the guidelines that we give the folks who help us to improve our algorithms overall.
So… first of all it’s worth keeping in mind that our quality raters help us to improve our algorithms overall. They do not review individual websites.
So it’s not something where you need to optimize your websites for access by quality raters.

John Mueller Downplays Author Bio Pages
Mueller does not at any point indicate that the author biography page is an important SEO factor. There is no indication from him that it is important to show the author bio. Instead, he focuses on how it impacts site visitors.
Here is what Mueller said:
With regards to author pages and expertise, authority and trustworthiness, that’s something where I’d recommend checking that out with your users and doing maybe a short user study, specifically for your set up, for the different set ups that you have, trying to figure out how you can best show that the people who are creating content for your website, they’re really great people, they’re people who know what they’re talking about, they have credentials or whatever is relevant within your field.

Author Pages May Not be Required
Mueller then went on to state that author biographies are not a technical issue that needs to be addressed.
This contradicts a pervasive SEO belief. Many SEOs insist that failure to include an author biography could result in a loss of rankings.
Google’s Quality Raters Guidelines were not produced to give insights into Google’s algorithm. Yet members of Google’s own Webmaster Help forums treat the information as if it holds insights into why a site may have lost rankings.
So that’s less something I would focus on this as a technical thing like you need to do this, this and this or this type of markup for these pages but rather more as a quality thing, as a user experience thing where you can actually do user tests with your users directly.

Author Biographies are Not a Ranking Signal?
There is no evidence that an author biography is a ranking signal. That’s something that’s so easy to fake, that it makes sense to not make it a ranking signal. So maybe it’s time to move beyond the endemic reductionist thinking that seeks to miniaturize Google’s algorithm to simple technical factors.
There is no evidence that author biographies are the critical ranking factor that many in the SEO industry claim them to be.
Watch the Webmaster Hangout here.
John Kuntz Sprinkles Dark Humor Through Salt Girl

Solo performance debuts at Playwrights’ Theatre
In the slide show above, John Kuntz (GRS’05) discusses his one-man play The Salt Girl, premiering November 5 at the Boston Playwrights’ Theatre.
John Kuntz likes to talk to himself. “I do it all the time,” he says. “And I have fabulous conversations with myself.”
That habit comes in handy on stage — the playwright and actor has written and starred in six one-man shows. His most recent, a darkly comic drama called The Salt Girl, debuts tonight at the Boston Playwrights’ Theatre.
“In my other one-man shows, I generally play between 30 and 40 roles,” says Kuntz (GRS’05). “The Salt Girl is different because I play only one — for two whole hours.”
Told from the viewpoint of Quint, a lonely, middle-aged gay man, The Salt Girl explores the character’s search for identity, acceptance, and closure. The play opens as Quint is about to commit suicide. Moments after he swallows a handful of pills and ties a plastic bag over his head, his cell phone rings.
“Your father was in an accident,” the caller tells him. “His condition is serious.”
Past and present collide as Quint sits with his comatose father and revisits painful memories. Kuntz moves through decades, transforming from embittered adult to awkward teenager, and back. He recalls his older sister, who vanished before he was born, his mother, who died when he was a child, and former lover Ted, who seduced him when he was 15.
“I am absolutely fascinated by Quint,” says director David R. Gammons. “I feel so much for him — for his loneliness and disconnection.”
To emphasize Quint’s detachment from reality, set designers constructed an enormous wall of 25 television sets that play snippets of old movies, commercials, and home videos.
“The screens represent fragments of Quint’s memories, fantasies, and desires,” says Gammons. “Most one-person shows don’t have a set at all, and we’ve created this dense and layered world with lots of props and music and projections.”
Kuntz, a graduate of the GRS Creative Writing Program, has written 14 plays, including the solo show Starf*ckers, winner of an Elliot Norton Award. Another, Jasper Lake, debuted at the Boston Playwrights’ Theatre in 2004 and went on to receive Kennedy Center honors, with the Paula Vogel National Playwriting Award of the center’s Michael Kanin Playwriting Awards Program.
He compares acting in a show that he’s also written to being inside a clock. “When I’m writing a play, I see only the face of the clock,” he says. “But when I’m acting in it, I can see all the gears and say, ‘This is too long and this needs to be moved.’”
Being both playwright and actor saves time because “it eliminates the bickering that typically occurs between the actors and playwright,” Kuntz continues. “The playwright has the final say, of course, but sometimes the actor is right. And because I’m both — well, I guess I’m always right!”
The Salt Girl opens November 5 at 7:30 p.m., and runs Thursdays to Sundays at the Boston Playwrights’ Theatre, 949 Commonwealth Ave., through November 22. Tickets are $30 for general admission, $25 for seniors, $10 for students (ID required) and may be purchased online, by phone at 866-811-4111, or in person at the Boston Playwrights’ Theatre. Performance times vary; check the calendar. This play contains nudity. For more information, call 617-353-5443.
Vicky Waltz can be reached at [email protected].
Explore Related Topics:
In October 2023, Google first announced its plan to prioritize a mobile search index over their desktop index.
Since then, this shift to a mobile-first index has been one of the most talked about – and sometimes misunderstood – topics in the industry.
With its full implementation happening soon, it’s worth taking a look at how prepared SEO professionals are for this major change.
For episode 145 of Search Engine Nerds, I spoke to Bastian Grimm, director of organic search at Peak Ace AG, who shared insights on Google’s mobile-first index and other factors that SEO professionals should take into consideration when preparing for the shift.

What are your thoughts in general about the mobile-first index?
Bastian Grimm (BG): It’s kind of an interesting thing… for quite a while – even now – no one really understood or probably some struggled a bit to really understand what’s going on.
It was roughly a year ago when I first figured out that the mobile-first index is essentially switching things around…
They’ll essentially [be] taking the mobile web presence and rank things based on that – and not as they do it today using the desktop [index].
Right now it doesn’t really matter from a scoring perspective if a mobile site is fast or not.
However, I guess from the user perspective that’s a totally different story. Because we don’t want to wait for a mobile site to load. I think it’s even worse if a mobile site is slow compared to a desktop site.

With Google announcing that page speed is going to be a ranking factor, do you think that means we’re about to see the rollout of mobile-first? Do we need to start paying attention to scores more than ever?
BG: I think, essentially, you’re right. Google making a clear stand on things is actually a rare thing… they’re trying to make sure that people get its significance.
However, though there’s also the other side of the story, that if you look at everything around speed and then obviously at some point you have to at least mention accelerated mobile pages, the AMP initiative, which I honestly think has at least a political taste to it in a sense that you couldn’t start the AMPs without HTTPS, now you need HTTPS.
Then you could say that’s just another way for Google, where people have to actually adapt to things that they want them to do. You could argue that the same is true for speed in a sense as well.
However, though, to be honest, I see the benefit of it. We all are not willing to wait for a site to load.
I guess it’s not so much about SEO. Yes, granted we might have better crawl efficiency. But generally it’s about user experience – it should be first and foremost.

What do you think is a good page speed?
BG: I think first and foremost, the PageSpeed Insights scoring that Google threw out there a couple of years ago with the tool that they provided is really problematic because this number just doesn’t reflect how fast the site is at all.
Forget about the PageSpeed Insights scoring. What the tool is recommending is just not applicable.
The second thing is Time Spent Downloading that we have in the Search Console – another number which is just not relevant at all.
Then the two-second loading time, I think that’s been out there for a while… There was just a recent study from Nielsen where they surveyed quite a bit of people and actually that was the same outcome. So you have this two seconds, maximum three. So that thing seems to be quite valid.
However, though I think if you look at measurement, that’s one of the biggest issues that we have on the performance side.
The state of the art measurement up until now was to go with something like WebPagetest and then use the Speed Index and maybe Time to Interactive. I think even Google figured out at some point that that’s not enough.
What they did, I think it was in Chrome, I believe: they introduced something called the Performance Observer. The idea is to measure, with GA in this case, the different paint timings.
When does the most relevant element on a site really appear and can you consume that?
If you think, in the YouTube logic, the only thing that you really want is to watch the video and that video is your hero element. That’s the thing that needs to be there really, really fast.
I kind of approach it a bit differently and look at those different paint timings – Time to First Paint and Time to First Meaningful Paint of this kind of hero element.

Where can you measure the paint timings?
BG: I think that there are two ways of doing that.
Performance Observer via Google Analytics: What you have to do is extend your GA code with Performance Observer and then it just measures these paint events. They show up in the custom metrics in GA.
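A minimal sketch of the approach Grimm describes, using the browser’s PerformanceObserver to capture paint timing entries and forward them to analytics.js. This is browser-only and guarded so it is a no-op elsewhere; the GA custom-metric slots (`metric1`, `metric2`) are placeholders for whatever slots you have configured.

```javascript
// Browser-only sketch: observe paint timings and send them to Google
// Analytics as non-interaction events. 'metric1'/'metric2' are
// assumed custom-metric slots — adjust to your GA property.
if (typeof PerformanceObserver !== 'undefined' && typeof ga === 'function') {
  const paintObserver = new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      // entry.name is 'first-paint' or 'first-contentful-paint'
      ga('send', 'event', 'Performance', entry.name, {
        [entry.name === 'first-paint' ? 'metric1' : 'metric2']:
          Math.round(entry.startTime),
        nonInteraction: true,
      });
    }
  });
  // buffered: true also delivers paint entries that fired before
  // this script ran.
  paintObserver.observe({ type: 'paint', buffered: true });
}
```

The `buffered: true` option matters in practice: paint events usually fire before the analytics snippet has loaded, and without it the observer would see nothing.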
Lighthouse: Lighthouse also measures paint timings and it’s way easier to do, just go to Chrome Dev Tools. You can run the same tests on mobile speed and then they use those paint timings and give you an idea of how fast your site is depending on the mobile connection.

Should people be making mobile versions of their site or is responsive really the best way right now for being mobile-friendly, mobile-ready?
BG: Google has been stating over and over, and I believe it makes sense, that if now on your desktop site you have structured data and all the other kinds of nice optimizations already in place and you haven’t done that on the m-dot, well, you have to synchronize things, because if they flip it over and then rank you based on the m-dot and you don’t have the stuff there, how can you rank?
I think that was the core reason for it. The downside though if you turn it around, performance optimization on a responsive site is actually way harder.
But if you do it right and you do it really well, I believe responsive is a great solution because you don’t have to maintain different sites for different kinds of devices really.

To listen to this Search Engine Nerds Podcast with Bastian Grimm:
Listen to the full episode at the top of this post
Subscribe via iTunes
Sign up on IFTTT to receive an email whenever the Search Engine Nerds podcast RSS feed has a new episode
Listen on Stitcher, Overcast, or Pocket Casts
Visit our Search Engine Nerds archive to listen to other Search Engine Nerds podcasts!
E-mail security vendors are trying to lure Postini customers now that Google has announced plans to shut down the unit and migrate its customers to Google Apps.
Competitors announcing special offers for Postini customers include Barracuda Networks and Proofpoint. Meanwhile, on Twitter, the owner of Micro Enterprises told Postini customers: “We can help!” while Spambrella tweeted that “we welcome all current users of this service.”
Postini rivals are reaching out to customers who may be uncertain about or opposed to Google’s plan to move them to Apps, its cloud-based e-mail and collaboration suite.
Barracuda Networks is offering Postini customers six months of its Email Security Service and its Spam & Virus Firewall for free, while Proofpoint promises them a “free and easy migration.”

Google Confident
However, Google maintains that it’s going about this migration in a way that will let it retain its Postini clients. “We don’t want to lose these customers,” said Adam Swidler, a senior manager with Google’s Enterprise division. Postini is used to protect about 26 million end users.
Google has been transferring Postini functionality to Apps for several years, and had said in the past that its ultimate goal was to eventually migrate all of it. In the past two weeks, it has provided more details about this transition. Recent reports in the press and in social media outlets have incorrectly reported that Google plans to “kill” Postini and leave customers up in the air.
Rather, Google will offer them a migration path, and once they’re on Apps, they will get similar but improved functionality, and on a technical platform that is stronger, Swidler said.
Postini’s email security service ranked low in a Gartner customer satisfaction survey in May, according to a recently published research note by Gartner analysts Peter Firstbrook and Matthew W. Cain.
“This migration may relieve some of that dissatisfaction,” they wrote. “Google Apps consistently upgrades its management console to reflect enterprise needs; Postini’s console rarely saw improvements.”
Users of the Postini product known today as Google Message Security, which includes spam and virus filtering, along with e-mail policy management, will get that functionality as part of Google Apps for Business.
Users of Postini’s Google Message Discovery will find its functionality replicated in Apps for Business and Apps Vault, an add-on that offers retention, archiving and e-discovery capabilities for email and chat messages.
Apps for Business costs $50 per user, per year, and with Apps Vault it costs $120 per user, per year. However, Postini users will continue to pay what they pay today for the Postini products — $12 per user, per year for Google Message Security and an incremental $13 per user, per year for those who also use Google Message Discovery — as long as they don’t use any of the other Apps suite components, in which case the price would increase.
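To make the price comparison above concrete, here is a quick sketch using the per-user annual figures quoted in the article (the seat count and variable names are illustrative):

```javascript
// Per-user annual prices as quoted in the article.
const PRICES = {
  postiniSecurity: 12,   // Google Message Security (grandfathered rate)
  postiniDiscovery: 13,  // incremental for Google Message Discovery
  appsForBusiness: 50,   // Google Apps for Business list price
  appsWithVault: 120,    // Apps for Business plus Apps Vault
};

function annualCost(users, perUserPrice) {
  return users * perUserPrice;
}

// A hypothetical 100-seat Postini customer using both products keeps
// paying $25/user/year, versus the $120/user/year Apps + Vault list price.
const seats = 100;
const grandfathered = annualCost(
  seats,
  PRICES.postiniSecurity + PRICES.postiniDiscovery
);
const listPrice = annualCost(seats, PRICES.appsWithVault);

console.log(grandfathered); // 2500
console.log(listPrice);     // 12000
```

In other words, the grandfathered pricing is worth roughly a 5x discount against the Apps-plus-Vault list price, as long as the customer sticks to the migrated Postini functionality.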
Regarding Postini’s Google Message Encryption, Google said in an FAQ that it will provide more information about its plans “later this year.” However, Swidler said Google plans to continue working with its current partner on this service, Zix Corp., to offer its cloud encryption service as an add-on option to Google Apps. Two other Postini products — Google Message Filtering and Postini Small Business Edition — have very small user bases and will be phased out.

Transition Period
After the transition, Postini customers will be able to continue using the Google e-mail security and discovery software in conjunction with their current e-mail clients and server software, such as on-premise Lotus Notes and Microsoft Exchange systems.
The transitions, which are expected to begin in earnest next year, should be “seamless” for those customers who have “straightforward” Postini configurations, he said.
In cases of more complex configurations, customers may need to pay attention to certain areas where some features have been re-architected in Apps, he said.
For now, Google is working on developing tools and utilities that’ll help make the transition process “as transparent as possible for users,” he said.
For Google Apps customers, the years-long process of transferring — and improving upon — Postini functionality has been aimed at making the suite stronger in the areas of e-mail security, discovery and encryption. Longer term, Google plans to broaden the scope of Apps Vault so that it can also encompass all documents and data contained in the suite, in particular on its Drive online storage component, he said.
Juan Carlos Perez covers enterprise communication/collaboration suites, operating systems, browsers and general technology breaking news for The IDG News Service. Follow Juan on Twitter at @JuanCPerezIDG.