W3CLove

Members

  • Posts: 4
Profile Information

  • Interests: w3c validation, web scraping

Posts
  1. It uses a dedicated server running the same software as the W3C validator (the sources are available). So it doesn't send requests to the W3C, but it gets the same responses it would get from there. (The last sketch after this list illustrates that kind of setup.)
  2. OK, I wanted to check whether you already knew a better solution than the one I've built before presenting it. If you consider that inappropriate, please just tell me and I'll remove this post. I don't want to spam anyone; I really think this tool can save everyone a lot of time. What I want to present here is W3CLove (https://www.w3clove.com), an online tool for validating whole web sites. You just enter the main URL of your site and click "validate". In seconds, you'll have a complete report of the errors and warnings found on your site (up to 250 pages), grouped and simplified (the second sketch after this list illustrates that kind of grouping). I'd appreciate your feedback on this tool; I'm really excited to present it here. Thanks!
  3. I strongly recommend git; it's really easy to use. There's a good introductory course at http://www.codeschool.com/courses/try-git. For hosting, you could run it yourself with gitolite, but I don't think it's worth it. GitHub is free for open source, and Bitbucket is free even for private repositories (up to a limit on users or space, I think).
  4. Does anyone know an easy way to validate an entire website (I mean, all of its pages, or at least a good number of them) with the W3C validator at http://validator.w3.org? That validator only checks a single URL at a time, so making sure your whole site is valid can take a lot of time. (The first sketch after this list shows one way to script it.)
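
A minimal sketch in Python of the scripted approach the question above asks for. It assumes the current Nu HTML Checker at validator.w3.org/nu/ and its JSON output (the legacy validator at validator.w3.org exposed a different, SOAP-based interface), and it uses a hand-written page list where a real tool would crawl the site or read its sitemap.xml:

```python
# Sketch: validate a list of pages with the W3C Nu HTML Checker's JSON interface.
import time

import requests  # third-party: pip install requests

PAGES = [  # hand-written list; a real tool would crawl the site or read sitemap.xml
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/contact",
]

for url in PAGES:
    # Ask the checker to fetch and validate the page, returning its messages as JSON.
    resp = requests.get(
        "https://validator.w3.org/nu/",
        params={"doc": url, "out": "json"},
        headers={"User-Agent": "site-validation-sketch/0.1"},
        timeout=30,
    )
    resp.raise_for_status()
    messages = resp.json().get("messages", [])
    errors = [m for m in messages if m.get("type") == "error"]
    warnings = [m for m in messages if m.get("subType") == "warning"]
    print(f"{url}: {len(errors)} errors, {len(warnings)} warnings")
    time.sleep(1)  # be polite: each page costs one request to the shared W3C service
```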
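The "grouped and simplified" report W3CLove describes could look roughly like this; this is not its actual code, and the sample data below is made up, standing in for the per-page message lists collected by the loop above:

```python
# Sketch: collapse identical validator messages from many pages into one counted line,
# so a mistake repeated in a shared template shows up once in the site-wide report.
from collections import Counter

results = {  # page URL -> validator messages (hypothetical sample data)
    "https://example.com/":      [{"type": "error", "message": "Stray end tag head."}],
    "https://example.com/about": [{"type": "error", "message": "Stray end tag head."},
                                  {"type": "error", "message": "Duplicate ID nav."}],
}

counts = Counter(
    m["message"]
    for messages in results.values()
    for m in messages
    if m.get("type") == "error"
)

# Most frequent errors first, like a site-wide summary report.
for text, n in counts.most_common():
    print(f"{n} occurrence(s): {text}")
```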
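Finally, a sketch of the setup the first post describes: querying your own copy of the validator software so no traffic hits the shared W3C service. W3CLove's actual stack isn't specified beyond "the same software that runs on the W3C validator"; this uses the self-hostable Nu HTML Checker (https://github.com/validator/validator) as a stand-in:

```python
# Sketch, not W3CLove's published stack: query a locally hosted Nu HTML Checker
# instead of w3.org. A local instance can be started, for example, with Docker:
#   docker run -it --rm -p 8888:8888 ghcr.io/validator/validator
import requests  # third-party: pip install requests

VALIDATOR = "http://localhost:8888/"  # self-hosted instance; nothing goes to w3.org


def validate(url: str) -> list:
    """Return the checker's message list for one page."""
    resp = requests.get(VALIDATOR, params={"doc": url, "out": "json"}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("messages", [])


errors = [m for m in validate("https://example.com/") if m.get("type") == "error"]
print(f"{len(errors)} errors")
```

Because the local instance accepts the same parameters as the public one, the multi-page loop and the grouping step above work unchanged against it.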