Check out a new JavaScript SEO series → https://bit.ly/2UeQ8Do
Making a site that is engaging for users but performs well in Google Search requires combining specific server- and client-side technologies. Learn about the best practices for building and deploying indexable sites and web applications with JavaScript frameworks. Lessons from this session can be implemented with most modern setups, whether you use Angular, Polymer, React, or other frameworks to build a full-fledged PWA or just use them for parts of a site. The session will also discuss search-friendly design patterns and SEO best practices.
Rate this session by signing in on the I/O website here → https://events.google.com/io/schedule/?section=may-10&sid=e8a0d2f7-1eb2-4dc9-a69c-3f28d460b61f
Watch more Webmasters sessions from I/O ’18 here → https://www.youtube.com/playlist?list=PLKoqnv2vTMUPdfWwy6HBOAwTmEnZzrTlx
See all the sessions from Google I/O ’18 here → https://www.youtube.com/playlist?list=PLOU2XLYxmsIInFRc3M44HUTQc3b_YJ4-Y
Subscribe to Google Search Central → https://goo.gle/SearchCentral
#io18 event: Google I/O 2018; re_ty: Publish; product: Search Console – General; fullname: Tom Greenaway, John Mueller;
Good morning!
This presentation just begs the same question over and over: why not make Googlebot better? Why shift the burden onto all these web developers instead of improving Googlebot to handle modern practices? Oh, your indexing bot doesn't know how to read/index pages that a human can reason about? Sounds like your bot could be improved. It uses Chrome 41—why? And so on.
Don't get me wrong, I think web developers should do all they can to improve SEO (especially with JSON-LD structured data), but some of these limitations of Googlebot are just annoying.
just learned about rendora and dynamic rendering, SEO problem is now solved
Why can't Googlebot update its version of Chrome???
You (Google Inc.) should make a video tutorial in Indonesian too.
Use React Static. Problem solved 🙂
These technical aspects are really important for the ongoing work of building a website. Many thanks.
On my website I use the #! fragment, and it works perfectly for users: I show the content without refreshing the whole page. But now Google no longer recommends this, and my site has dropped in indexed pages and therefore in ranking too.
I don't understand why they don't take the content that comes after #!. Google always recommends focusing on users when building a site, but that is no longer the case here. The site works perfectly for users—they see the content—but for Google that is now insignificant, and if I have to change anything in my code, it is so that Google can interpret it. Contradictory, no?
Anyway, this doesn't happen with search engines like Bing or DuckDuckGo; they do crawl all the content of my site. Google says to use the History API, which I have been trying to do, but I can't make it work for my case.
So, do we focus on users or search engines?
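For anyone stuck on the same migration: the move away from #! usually means mapping each hashbang route to a real path and pushing it with the History API. This is a minimal sketch of that mapping—the helper name and URL shapes are my own assumptions, not anything from the talk:

```javascript
// Hypothetical helper: turn a legacy hashbang URL into a clean path
// that crawlers can treat as a real, indexable URL.
function hashbangToPath(url) {
  const u = new URL(url);
  if (u.hash.startsWith('#!')) {
    // '/#!/products/42' → '/products/42'
    return u.pathname.replace(/\/$/, '') + '/' + u.hash.slice(2).replace(/^\//, '');
  }
  return u.pathname; // already a clean path, leave it alone
}

// In the browser, after rendering the matching view, you would call:
// history.pushState({}, '', hashbangToPath(location.href));
```

Your server then has to answer those clean paths directly (at least with the app shell), otherwise a deep link or a crawler hitting `/products/42` gets a 404.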
You're missing a 'b' in a part of the info 🙂
"Watch more >Wemasters< sessions from I/O '18 here"
Just trying to help, keep being awesome and an inspiration! 🙂
Awesome video <3
Interesting point about doing complete dynamic rendering for search-bot user agents and not for users!
Question: You mention using the Mobile-Friendly Test and the Rich Results Test as rendering test platforms, essentially. Why do this instead of using Fetch and Render in Search Console? In fact, the first time I tried the Rich Results Test, it told me the page was not eligible for "rich results known by this test."
Dynamic rendering is so ridiculous… What makes you think I'm going to code like a #$%#! just to make your job simpler, when implementing it requires significant infrastructure? Google does incredible things all the time, but this… this goes nowhere. I really don't think people are going to implement it, or if they try, they'll give up afterwards…
Google, Please provide a link to the documentation regarding dynamic rendering and the official policy change.
Very useful information, loving the transparency.
How can we make sure that Google will not consider dynamic rendering to be cloaking? Previously the recommendation was not to check for Googlebot.
I have a question: we're working on a brand-new site built on JS. We've blocked it with robots.txt because we're afraid the bot might index a lot of "empty" pages—pages without dynamic rendering… However, I want to test and see how Googlebot will render those pages. But I can't test that until I unblock it in robots.txt, right? I can't even use Search Console's "Fetch as Google" while it's blocked by robots.txt. So what might be the solution for checking how Googlebot will render my site without opening up the robots.txt file?
23:55 suggests implementing server-side rendering only for Googlebot. That might be a good solution; however, I thought that was considered search-engine cloaking (serving different results to users vs. bots), which would penalize your SEO… wouldn't it?
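As I understand the talk, the bot-only branch is supposed to be safe because the pre-rendered HTML is the same content users eventually see—only the rendering location differs. A minimal sketch of that routing, assuming an Express-style server (the bot list and handler names here are illustrative, not an official API):

```javascript
// Sketch of dynamic rendering as described in the session: send known
// crawlers a pre-rendered snapshot, everyone else the normal JS app.
const BOT_PATTERN = /googlebot|bingbot|duckduckbot|baiduspider/i;

function isKnownBot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// Express-style handler factory; the two handlers are assumptions:
// servePrerendered might proxy to Rendertron or a Puppeteer pool,
// serveApp just sends the client-side rendered bundle.
function dynamicRender(serveApp, servePrerendered) {
  return (req, res) => {
    if (isKnownBot(req.headers['user-agent'])) {
      servePrerendered(req, res); // static HTML with the same content
    } else {
      serveApp(req, res);         // normal client-side rendering
    }
  };
}
```

The key to staying on the right side of the cloaking line is that both branches carry equivalent content; the user-agent check only changes *how* it is rendered, not *what* is shown.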
Superb work, Thanks a lot. Keep it up. 🙂
Thanks, Google!
Awesome info.. Thank you!
Great and helpful information! Thanks, Google!
Will Googlebot use Chrome 59 in 2018?
https://www.search-foresight.com/googlebot-chrome-59/
Because, you know, ES20**.