Making a website responsive with CSS - any SEO concerns?

16 replies
  • SEO
Anything to look out for when making a site responsive with CSS, working with a web developer?

If the code is done right, should it be on good terms with Google?

-Thanks
  • yukon
    The only thing that matters is the text version of your webpage (example).
    Signature
    Hi
    • Icematikx
      Originally Posted by yukon

      The only thing that matters is the text version of your webpage (example).
      Hmm, not quite. Google has innovated quite a lot this year. I've just worked with a team of PHP developers to build a site based on heavy JavaScript usage, such as AngularJS. For most this is a no-no, but thanks to that innovation, we're now able to serve Google and other search engines a different page than the user sees.

      Back in the day this would be clear cloaking. But now there's finally a way to show Google what is really on our site, by using _escaped_fragment_ and a pre-rendering service.

      What this means is we can serve client-side JavaScript and have the client render it, rather than the server - without negatively affecting on-site content.

      I believe web development is moving towards client-side JavaScript such as AngularJS, and would heavily suggest everybody with a spare hour start researching it, and how _escaped_fragment_ works with hash-bang URLs.

      But yes, while text/HTML versions matter for now, we're already seeing Google render JS and other languages just fine. It's only a matter of time before they're able to parse that data with a client-side rendering bot, at which point the whole game will change.

      I no longer build sites worrying about HTML/PHP code, as I know there are solutions out there to achieve the same thing while using more powerful languages like JS.
      Signature

      Just got back from a #BrightonSEO. I was given room 404 in the hotel I stayed at. Couldn’t find it anywhere!

      • yukon
        Originally Posted by Icematikx

        Hmm, not quite. Google has innovated quite a lot this year. I've just worked with a team of PHP developers to build a site based on heavy JavaScript usage, such as AngularJS. For most this is a no-no, but thanks to that innovation, we're now able to serve Google and other search engines a different page than the user sees.

        Back in the day this would be clear cloaking. But now there's finally a way to show Google what is really on our site, by using _escaped_fragment_ and a pre-rendering service.

        What this means is we can serve client-side JavaScript and have the client render it, rather than the server - without negatively affecting on-site content.

        I believe web development is moving towards client-side JavaScript such as AngularJS, and would heavily suggest everybody with a spare hour start researching it, and how _escaped_fragment_ works with hash-bang URLs.
        It doesn't matter what loophole you're jumping through, Google is still stripping the page down to plain text & links, with the exception of things like images, which get filtered into Google Image search.
        • Icematikx
          Originally Posted by yukon

          It doesn't matter what loophole you're jumping through, Google is still stripping the page down to plain text & links, with the exception of things like images, which get filtered into Google Image search.
          That's the point though. It isn't a loophole. Google has been working heavily with developers this year to develop _escaped_fragment_ and bring it into the mainstream. Google actually recommends using it.

          It's a fairly new concept to me, as it's a fairly new concept overall. But after building a site primarily in AngularJS, serving client-side JavaScript, and having Google read that as text/HTML, I'm absolutely amazed.

          • yukon
            Originally Posted by Icematikx

            That's the point though. It isn't a loophole. Google has been working heavily with developers this year to develop _escaped_fragment_ and bring it into the mainstream. Google actually recommends using it.

            It's a fairly new concept to me, as it's a fairly new concept overall. But after building a site primarily in AngularJS, serving client-side JavaScript, and having Google read that as text/HTML, I'm absolutely amazed.
            Do you have any links where Google recommended JavaScript, with working examples? No offense, but I'd have to see that to believe it, considering Google's 17-year history of dealing with plain text/links.
            • Icematikx
              Originally Posted by yukon

              Do you have any links where Google recommended JavaScript, with working examples? No offense, but I'd have to see that to believe it, considering Google's 17-year history of dealing with plain text/links.
              Google wrote the AJAX crawling specification, and this has now been ported over to JavaScript.

              Here is the AJAX Specification: https://developers.google.com/webmas.../specification

              Here's how one example prerender service works:
              • The middleware checks for _escaped_fragment_ or inspects the user agent of each request to your app
              • If a search engine crawler is detected, the middleware makes a GET request to the Prerender service
              • The Prerender service checks its cache first; a cached page is returned in under 200ms
              • If the page isn't cached, we make a call to your app for the same exact page that the crawler is requesting
              • The Prerender service renders the page with all of the javascript using phantomjs and (optionally) saves it in the cache
              • The HTML of that rendered page is sent back to the middleware in your app
              • The middleware then returns that HTML to the crawler
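
              The detection step in that list can be sketched in plain JavaScript. This is an illustrative, simplified sketch - the function name and bot list here are mine, not prerender.io's actual middleware code:

```javascript
// Simplified sketch of the crawler-detection step a prerender
// middleware performs. The function name and bot list are
// illustrative, not prerender.io's actual implementation.
var BOT_AGENTS = ['googlebot', 'bingbot', 'yandex', 'baiduspider'];

function shouldPrerender(url, userAgent) {
  // Case 1: the crawler requested the escaped-fragment form of the URL.
  if (url.indexOf('_escaped_fragment_') !== -1) return true;
  // Case 2: the user agent looks like a known search-engine bot.
  var ua = (userAgent || '').toLowerCase();
  return BOT_AGENTS.some(function (bot) {
    return ua.indexOf(bot) !== -1;
  });
}

// A crawler request gets routed to the prerender service:
console.log(shouldPrerender('/page?_escaped_fragment_=', 'Googlebot/2.1')); // true
// A normal browser request is served the client-side app:
console.log(shouldPrerender('/page', 'Mozilla/5.0 (Windows NT 10.0)'));     // false
```

              When a check like this returns true, the middleware fetches the rendered snapshot (from cache, or via PhantomJS) and returns that HTML instead of the raw JavaScript app.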

              Twitch uses JavaScript, and that's why this initiative has been developed. So many developers are now moving towards client-side JavaScript.

              In essence, you push the JavaScript to the browser, and the browser parses and renders it - taking a heavy load off the server, not to mention very fast loading times.

              It is the future, and I'm embracing it as it comes. Far too many people will be left behind building PHP/HTML sites, never learning to appreciate the complex value of JavaScript.

              For implementation, include:

              <meta name="fragment" content="!">

              in your JavaScript pages.

              Google will then crawl that URL, adding "?_escaped_fragment_=" to it.

              For example:

              Warriorforum.com/?_escaped_fragment_=
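
              The URL rewrite described above can be sketched as a small function. This is a simplified illustration of the mapping in the AJAX crawling spec (encodeURIComponent encodes a little more than the spec minimally requires):

```javascript
// Sketch of the hash-bang -> _escaped_fragment_ URL mapping from the
// AJAX crawling specification. Simplified: encodeURIComponent is a
// slightly stricter encoding than the spec minimally requires.
function toEscapedFragmentUrl(url) {
  var sep, hashBang = url.indexOf('#!');
  if (hashBang === -1) {
    // Pages opting in via <meta name="fragment" content="!"> but with
    // no #! in the URL are fetched with an empty _escaped_fragment_.
    sep = url.indexOf('?') === -1 ? '?' : '&';
    return url + sep + '_escaped_fragment_=';
  }
  var base = url.slice(0, hashBang);
  var fragment = encodeURIComponent(url.slice(hashBang + 2));
  sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + fragment;
}

console.log(toEscapedFragmentUrl('http://example.com/#!key=value'));
// -> http://example.com/?_escaped_fragment_=key%3Dvalue
console.log(toEscapedFragmentUrl('http://example.com/'));
// -> http://example.com/?_escaped_fragment_=
```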

              That _escaped_fragment_ URL is then served either by a third party such as www.prerender.io, or is handled server-side. We chose to handle it server-side via Amazon's cloud storage.

              That page is served as a snapshot of the client-side rendered page, and its HTML/text content is what Google receives.

              This, of course, is not considered cloaking - hence why the meta-name initiative was developed and implemented.

              You can build heavy JS sites, and serve SEO'd content to the search engines. There's no cloaking. You're building a better user experience by using JS, and it also gives you the opportunity to implement more complex code.

              It's a fantastic initiative, though I doubt we'll see it ported over to WordPress or any mainstream CMSes just yet.

              If you want a fantastic JS-based CMS, look into Triangulate CMS or Respond CMS - both of which are free.

        • SEO-Dave
          Originally Posted by yukon

          It doesn't matter what loophole you're jumping through, Google is still stripping the page down to plain text & links, with the exception of things like images, which get filtered into Google Image search.
          Used to be true, but not anymore - Google is now taking rendering into account.

          Official Google Webmaster Central Blog: Rendering pages with Fetch as Google

          Joost de Valk (developer of the Yoast plugin) has an interesting study at https://yoast.com/google-panda-robots-css-js/

          Since responsive design has an impact on rendering, what Google sees (what it renders) is important.

          You just have to look at the results from a Google PageSpeed Insights test. The Warrior Forum, for example, isn't responsive - the viewport isn't configured - and Google knows that when links are too close together they'll be accidentally tapped on mobile.

          Google looks set to take these sorts of rendering and speed (usability) issues more into account in the future. If your site isn't mobile-optimized, it won't rank as well for those searching on mobile devices.

          There's loads about this on the usual SEO sites.

          Report: Mobile "Configuration" Errors Cause 68 Pct. Traffic Loss

          I guess I should duck for cover as the regulars to this forum who are 5+ years out of date on SEO start to attack me again :-)

          David
          • yukon
            Originally Posted by Icematikx

            And while the above post was slightly off-topic, I believe it's in the interest of all mainstream SEOs to start learning how JavaScript can be served in a search-engine-friendly way. Many clients will now begin using JS frameworks like AngularJS.

            The traditional method was "time for a new site". Now it's about taking that framework and making it search-engine friendly, and understanding WHY it's SEO friendly. We all know the SEO game is changing, and this is yet another way to separate the self-proclaimed gurus from the big boys.
            Here's how I see it: I'm not interested in taking on clients who want to overcomplicate something when I know I can rank an HTML webpage. There are millions of other clients that use conventional webpages.

            I'm unlike most around here: if I'm not interested in a client project, I don't have a problem saying so. That's not meant to be offensive in any way, I just don't take on work unless it fits my needs.

            Google won't abandon HTML in our lifetime.

            Originally Posted by SEO-Dave

            Used to be true, but not anymore - Google is now taking rendering into account.

            Official Google Webmaster Central Blog: Rendering pages with Fetch as Google

            Joost de Valk (developer of the Yoast plugin) has an interesting study at https://yoast.com/google-panda-robots-css-js/

            Since responsive design has an impact on rendering, what Google sees (what it renders) is important.

            You just have to look at the results from a Google PageSpeed Insights test. The Warrior Forum, for example, isn't responsive - the viewport isn't configured - and Google knows that when links are too close together they'll be accidentally tapped on mobile.

            Google looks set to take these sorts of rendering and speed (usability) issues more into account in the future. If your site isn't mobile-optimized, it won't rank as well for those searching on mobile devices.

            There's loads about this on the usual SEO sites.

            Report: Mobile "Configuration" Errors Cause 68 Pct. Traffic Loss

            I guess I should duck for cover as the regulars to this forum who are 5+ years out of date on SEO start to attack me again :-)

            David
            ...again, you're overcomplicating something that's historically simple (HTML). I might use JavaScript to break up a URL into chunks, but that's about it.
            • Icematikx
              Originally Posted by yukon

              Here's how I see it: I'm not interested in taking on clients who want to overcomplicate something when I know I can rank an HTML webpage. There are millions of other clients that use conventional webpages.
              Perhaps that's you. But times are changing, Yukon, and all major brands are already utilizing JavaScript to implement features of their sites. Whether that's showing dynamically loaded content or rendering the WHOLE page in JavaScript, it's a powerful architecture that will gain traction quickly.

              If you want to be left behind and stick to HTML/PHP optimization, that's up to you. The future is here right now, and it's down to everybody to embrace it and learn how and why it works the way it does.

              It isn't about "optimizing" your site for Google either. It's about a better user experience. It's about being able to convert people better. It's about dynamically showing pricing, shipping - everything.

              Off-site SEO is so simple: build links and win the game. Build citations and rank locally. Build a PBN and see the rankings rise. One has to look further than that, though, especially when working with huge companies who will always choose user experience and satisfaction over Google search-bot satisfaction.

  • yukon
    @ Icematikx

    To be honest, that looks like overcomplicating simple things; HTML is far easier to read & work with for SEO.

    I've also never found a reason to use AJAX. Why serve a converted AJAX HTML snapshot when you can simply use HTML?
    • Icematikx
      Originally Posted by yukon

      @ Icematikx

      To be honest, that looks like overcomplicating simple things; HTML is far easier to read & work with for SEO.

      I've also never found a reason to use AJAX. Why serve a converted AJAX HTML snapshot when you can simply use HTML?
      We're not talking about AJAX, Yukon. I'm talking about JAVASCRIPT - CLIENT-SIDE rendered JavaScript.

      Traditionally, the server would render the JavaScript and send the output to the user. NOW, it's the client's browser that renders the JavaScript, in a "virtual" environment that only that specific browser sees. This means extremely fast loading times - literally, I'm talking sub-100ms loading times across a whole domain.
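
      As a toy illustration of that split (my own minimal example - nothing like a full framework such as AngularJS): the server ships a template plus raw data, and the browser produces the final markup.

```javascript
// Toy client-side rendering: the server sends a template and data;
// the browser fills the template in. Illustrative only - real
// frameworks like AngularJS do far more (data binding, updates, etc.).
function render(template, data) {
  return template.replace(/\{\{(\w+)\}\}/g, function (match, key) {
    return key in data ? String(data[key]) : match;
  });
}

var template = '<h1>{{title}}</h1><p>{{body}}</p>';
var html = render(template, { title: 'Hello', body: 'Rendered in the browser' });
console.log(html);
// -> <h1>Hello</h1><p>Rendered in the browser</p>
```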

      Not to mention, JavaScript is 50x more powerful than PHP/HTML. How do you think Twitch was built? Do you think Twitch used HTML, CSS and PHP? No, they use JavaScript.

      I'm not talking about little affiliate sites that promote food processors on Amazon. I'm talking about clients who serve tens of thousands of visitors a month; these are the people now implementing things like AngularJS to serve dynamic content to users. If you don't know how to optimize JS for Google and other search engines, you're in a bit of a ruffle.

  • Icematikx
    And while the above post was slightly off-topic, I believe it's in the interest of all mainstream SEOs to start learning how JavaScript can be served in a search-engine-friendly way. Many clients will now begin using JS frameworks like AngularJS.

    The traditional method was "time for a new site". Now it's about taking that framework and making it search-engine friendly, and understanding WHY it's SEO friendly. We all know the SEO game is changing, and this is yet another way to separate the self-proclaimed gurus from the big boys.

  • SEO-Dave
    Icematikx, that sounds like a really interesting approach - are you running your results through Google's PageSpeed Insights tool and getting good results? Got anything live to run through the test page?

    I've gone the other way in development: I've moved to avoiding JavaScript that's render-blocking. I used to use a mobile-responsive menu that relied on jQuery to serve the mobile menu at device widths below 800px, but jQuery is quite big and render-blocking, so I replaced the jQuery code with CSS that renders the same output.

    Did the same with an image slider that used the MooTools JavaScript framework - replaced it with some interesting CSS.

    Fixed pretty much all major PageSpeed Insights issues other than render-blocking CSS. The next step would be to deal with the render-blocking CSS file; not sure if it's worth it. I'd have to serve enough inline CSS to render the page and load the CSS file in the footer.

    The problem with this approach is it reduces site features - there are a lot of flashy features that use jQuery, for example. Are you saying your approach allows interesting features without the performance hit of using a lot of JavaScript on a PHP/HTML-driven site, because the site is served completely with JavaScript?

    David
    • Icematikx
      Originally Posted by SEO-Dave

      Icematikx, that sounds like a really interesting approach - are you running your results through Google's PageSpeed Insights tool and getting good results? Got anything live to run through the test page?

      I've gone the other way in development: I've moved to avoiding JavaScript that's render-blocking. I used to use a mobile-responsive menu that relied on jQuery to serve the mobile menu at device widths below 800px, but jQuery is quite big and render-blocking, so I replaced the jQuery code with CSS that renders the same output.

      Did the same with an image slider that used the MooTools JavaScript framework - replaced it with some interesting CSS.

      Fixed pretty much all major PageSpeed Insights issues other than render-blocking CSS. The next step would be to deal with the render-blocking CSS file; not sure if it's worth it. I'd have to serve enough inline CSS to render the page and load the CSS file in the footer.

      The problem with this approach is it reduces site features - there are a lot of flashy features that use jQuery, for example. Are you saying your approach allows interesting features without the performance hit of using a lot of JavaScript on a PHP/HTML-driven site, because the site is served completely with JavaScript?
      We're about to launch our new site this week using client-side JavaScript. Load times across the whole site are remarkably impressive. We're relying on the client to render everything.

      Look into this: Triangulate - Home

      Also AngularJS: https://angularjs.org/

      Also look at: Backbone.js, Ember and the latest jQuery.

      Then for serving JS as SEO friendly content: https://prerender.io/

      It's a lot of reading and a steep learning curve, but I'm getting there...

  • yukon
    OP is probably more confused now, considering he only asked about CSS, which has nothing to do with AJAX or JavaScript.

    Obviously reinventing the wheel isn't an option.
    • SEO-Dave
      Originally Posted by yukon

      OP is probably more confused now, considering he only asked about CSS, which has nothing to do with AJAX or JavaScript.

      Obviously reinventing the wheel isn't an option.
      LOL, that's a good point.

      OP, if you build responsive CSS you should find Google will like your site more than if you don't, especially for mobile-device searches.

      Use the Google PageSpeed Insights tool I linked to to check for mobile issues. Aim for 100/100 on usability and for the 70s/100 on mobile speed.

      The main issues for mobile search are likely to be viewport-related; your site should either have a separate mobile version or, better yet, one version that works seamlessly at all device widths.

      It can be a lot of development work to get right - just getting tap-target issues fixed is a pain. If you can match my site's results (PageSpeed Insights), you've done really well.

      David
