Tag Archives: Standards

HTML5 Gets a Logo

The W3C has put up a microsite with the new logo for HTML5. It looks pretty good, certainly better than its predecessors, even if it does have a bit of the “Web 2.0” look that is finally starting to lose its novelty. I like that they kept the gradients to a minimum; restraint is becoming more common as the glossy “Web 2.0” style fades.

You can pick up SVG and PNG versions of the icon there, as well as T-shirts featuring the logo…and free stickers if you live in the U.S. and have some spare postage stamps lying around.

Webmonkey brings up an interesting, and troubling, point about the HTML5 Logo site. The FAQ calls the logo a “general-purpose visual identity for a broad set of open web technologies, including HTML5, CSS, SVG, WOFF, and others.”

It doesn’t really matter if the New York Times thinks CSS 3 or SVG are HTML5, but we’d like to think that at least the organization in charge of describing what is, and is not, HTML5 would make some effort to distinguish between tools. Lumping everything together is as silly as a carpenter referring to every tool in their toolkit as “a hammer.”

That doesn’t sound very good to me. Is “HTML5” becoming the new buzzword to replace “Web 2.0,” one sanctioned by its own standards body?

Update: And now it sounds like the HTML spec is no longer going to have specific version numbers.

A Standard to Specify a Canonical Short Link

There has been a small push to create a standard way for a web page to specify a preferred short link for use in places like Twitter. Something like the rel="canonical" trick that tells search engines which page on your domain is the one that should be indexed. Basically, a meta tag to put in the page header, which could then be read by Twitter applications. The end goal is to help reduce the issue of “link splintering,” where everyone ends up linking to the same page with a different URL. (For instance, I could shorten a link to this page with Is.gd, then three others could create their own different Bit.ly links…)

One proposal is rev="canonical", but I really don’t like that option. This comment sums it up pretty well. Rev is too easily confused with rel, and is deprecated in HTML5 to boot. The “canonical” terminology also isn’t fitting, since it implies that the short URL is the preferred URL for the page (i.e. “the short link is preferred over the full one”) rather than an alternate link.

I found it interesting to learn that WordPress 3.0 is going to start automatically including something along the lines of this on permalink pages:

<link rel='shortlink' href='http://fantasyfolder.com?p=32' />

There will be hooks to override it with your own URL (so a plugin could place a single Bit.ly or YOURLS link there on publication), but the URL is irrelevant for the purpose of this discussion. The rel='shortlink' part is what interests me. I think it’s the perfect term to use for this scenario.

I think, whether you use WordPress or not, rel="shortlink" is what you should go with. (If you’re worried about controlling short links, at least.)
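For the consuming side — say, a Twitter client that fetched a page and wants its preferred short URL — here is a rough sketch of extracting the rel="shortlink" tag from raw HTML. The regex is purely illustrative (a real client should use an actual HTML parser), and it assumes the rel attribute appears before href, the way WordPress emits it:

```javascript
// Illustrative only: pull the rel="shortlink" href out of a page's HTML.
// Assumes `rel` precedes `href` in the tag, as in WordPress's output;
// a proper HTML parser would handle arbitrary attribute order.
function extractShortlink(html) {
  const match = html.match(
    /<link[^>]+rel=["']shortlink["'][^>]+href=["']([^"']+)["']/i
  );
  return match ? match[1] : null;
}
```

A client would fall back to the page’s full URL when the function returns null, so pages without the tag keep working as they do today.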

Learning oEmbed

WoorkUp has an interesting post on oEmbed, and how you can use jQuery to take something like a YouTube or Flickr URL and automatically load the video or image on the page. Facebook uses this technique to fetch thumbnails and descriptions when you post a link.

WordPress 2.9 also includes oEmbed functionality, allowing you to easily add YouTube videos to your posts, simply by wrapping the video page’s URL in a pair of “embed” shortcode tags.

WoorkUp shows how you can implement the feature in your own projects.
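The flow underneath all of these is the same: build a request to the provider’s oEmbed endpoint with the content URL, get JSON back, and drop the response’s html field into the page. A minimal sketch, using YouTube’s oEmbed endpoint as the example provider (maxwidth is an optional parameter from the oEmbed spec):

```javascript
// Build the request a consumer sends to a provider's oEmbed endpoint.
// `format=json` asks for a JSON response; `maxwidth` is optional.
function buildOEmbedRequest(endpoint, contentUrl, maxWidth) {
  const params = new URLSearchParams({
    url: contentUrl,
    format: "json",
    maxwidth: String(maxWidth),
  });
  return endpoint + "?" + params.toString();
}

// What the consumer does with the provider's JSON response: "video" and
// "rich" responses carry the ready-to-insert markup in the `html` field.
function embedHtmlFromResponse(json) {
  const data = JSON.parse(json);
  return data.html || null;
}
```

In a real page you would fetch the built URL (or proxy it server-side to dodge cross-domain restrictions, as the jQuery examples do) and inject the returned html where the bare link used to be.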

Learning oEmbed: Convert Links Into Embedded Content [WoorkUp]

Internet Explorer Should be Powered by WebKit

There, I said it.

If Microsoft were to switch from their proprietary “Trident” rendering engine to an open source solution such as WebKit or Mozilla’s Gecko, it would do far more than simply save designers headaches.

It would save Microsoft money and development time, net them some publicity, and vastly improve their web browser. What’s not to like?

What is WebKit? It’s an open source HTML rendering engine that powers Google Chrome, Apple Safari, the iPhone’s MobileSafari, and just about any Mac OS X application that displays web pages.

Internet Explorer could at long last become reasonably standards compliant, and Microsoft would be able to put their resources towards improving their browser’s user interface, rather than wasting time reinventing the wheel.

Maybe it’s just wishful thinking, but there is no reason it couldn’t be done.

How Much Longer Will IE Last?

Internet Explorer is notorious for its laughable support of W3C standards. Look around the web design community and you’ll find that a lot of designers don’t like the browser one bit, thanks to all the workarounds needed to make a page that displays fine in most other browsers work in IE as well.

Security isn’t exactly the browser’s strongpoint either, as the public is becoming increasingly aware of.

Microsoft has been losing browser market share for the past few years as people move away from IE. Slowly but surely, IE’s userbase is shrinking, and other browsers are picking up the switchers. Internet Explorer had 43.6% market share as of February, down from 54.7% early last year and 91.1% in early 2005.

Firefox is up to 46.4% market share now, while Chrome, Safari, and Opera collectively hold roughly 7%. (Chrome has shown very fast growth considering its age.) The general public is becoming more aware of browsing alternatives, and of the security benefits of switching to them. More people are buying Macs too, which ship with Apple’s Safari browser instead of Internet Explorer. People are learning, and moving away from IE.

Continue reading →

Only 4.13% of the Web is Standards Compliant?

Browser maker Opera recently conducted a study to see how much of the web is standards compliant. Using a specialized web crawler dubbed “MAMA” (for “Metadata Analysis and Mining Application”) to examine around 3.5 million pages, the company determined that a mere 4.13% of the web is standards compliant.

Of course, one wonders about the accuracy of this study. There are certainly more than 3.5 million pages on the internet. Perhaps they were only searching a portion of the web that had fewer valid pages? And does a site with 100 non-compliant pages count as 100 invalid pages? How many of those sites are invalid because they try to comply with Microsoft’s bogus standard (a.k.a. the “does it look alright in IE?” standard) at the same time?

I can understand the small figure, and maybe it is realistic. After all, many a website almost validates, such as Reddit.com, which has one lone (and minor) error stopping it from validating. And heck, Google and Amazon are validity-challenged. Amazon has “1445 Errors, 135 warning(s)” on its front page.

Many monolithic sites that you’d think would validate don’t, though they look fine in most browsers anyway. This brings up an interesting question: Does it matter whether you meet the standard to the letter, or is it okay if it looks fine in all of the standards-compliant browsers? What’s your opinion?

News article: Opera study: only 4.13% of the web is standards-compliant

Interesting Reddit Discussion: http://www.reddit.com/r/programming/comments/77grk/

Oh No They Didn’t! Microsoft and Web Standards

Remember the big deal Microsoft made about how Internet Explorer 8 would finally be standards compliant? Aside from some odd stuff they were doing, it looked like they were actually putting in an effort to follow through on their promise, or at least something close to it.

Apparently, a lot of web pages will load in IE7 mode instead of standards mode. The Register has the full details.

This week, the promise was broken. It lasted less than six months. Now that Internet Explorer IE8 beta 2 is released, we know that many, if not most, pages viewed in IE8 will not be shown in standards mode by default. The dirty secret is buried deep down in the «Compatibility view» configuration panel, where the «Display intranet sites in Compatibility View» box is checked by default. Thus, by default, intranet pages are not viewed in standards mode.

So all intranet sites will be shown in non-standards mode. Then we have all the version targeting nonsense they’ve been planning.

Oh, and guess what happens whenever a page loads in standards mode? A little icon appears showing a broken page. When clicked, it forces the page into “IE7 compatibility” mode. So the browser tricks people into not using standards mode.

Continue reading →

Internet Explorer 8: The Next IE5?

I installed the Internet Explorer 8 beta a few days ago, and I’ve tested some sites in it. So far, I’m not really impressed. It seems to pass the ACID2 test, but there are plenty of rendering bugs that drive me crazy…and they had better be fixed by the time the final release is out.

I’m very well aware that the browser is in beta, but I can’t help but be worried about this. Some pages seem to render worse than ever, and I can’t help but think “Are these bugs, or some sort of ploy to keep things as they’ve been?” It’s not really in Microsoft’s best interests to be fully standards compliant, after all.

Here are just a few examples of the render bugs I’ve noticed:

Continue reading →

On IE8’s Controversial “Standards Mode”

Internet Explorer version 8, to be released later this year, will, by default, render web pages the same way as IE7. If the meta tag <meta http-equiv="X-UA-Compatible" content="IE=8" /> is detected in a page’s header, it will render in the new ACID2-compliant mode. This is a bad idea, for several reasons.

First of all, Microsoft, again, is trying to force us to build web pages for IE specifically. If the X-UA-Compatible tag is not found, a page will not render in the standards-compliant mode. So, basically, you have to specifically instruct IE to follow the standard it’s supposed to follow anyway. Does that make any sense? No. It seems like another ploy to maintain their monopoly. After all, if most pages work in any browser, there is less reason for people to continue using IE.

Microsoft claims that the meta tag exists so manually-updated web pages won’t break when IE8 comes around. Manually updated? If you’re still doing that, you’re just asking for trouble. Take this as an opportunity to move away from a manually-updated site. While you’re making your design standards-compliant, install a CMS, or at least start using PHP includes. Also, as Bb’s RealTech said:

“The argument for this tag is actually the number one argument against this tag: those people with hand crafted pages are not going to be willing to hand edit each page to make it standards compliant–why on earth would they hand edit each of these pages to add this tag? As for being able to test a site against a version of a browser–this site looks good in IE7, but not IE8, or some such nonsense–when are we finally going to actually commit to standards? Not just as browser vendors, but as web page designers and developers? More importantly, as people who use browsers to surf the web?”

Instead, IE8 should check the DOCTYPE. If it’s XHTML Strict, use the standards mode. If it’s HTML Transitional, or absent, use the IE7 engine. I’m guessing that most of the people who would run into the problem Microsoft describes wouldn’t be using XHTML; it’s much easier to manually update HTML than strict XHTML, after all. Or IE8 could validate the entire markup in-browser and, if it isn’t standards-compliant, render in IE7 mode and display an icon showing that the page isn’t compliant. At the very least, they should make standards mode the default and have the meta tag specify IE7 mode instead. Microsoft needs to base their trigger on an existing feature of (X)HTML, rather than inventing proprietary tags.
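The DOCTYPE-based rule I’m proposing could be sketched like this (the function and mode names are mine, purely illustrative — nothing Microsoft actually ships):

```javascript
// Illustrative sketch of the proposed mode switch: pick the rendering
// engine from the page's DOCTYPE instead of a proprietary meta tag.
function pickRenderingMode(doctype) {
  if (!doctype) return "ie7"; // no DOCTYPE: legacy behavior
  const dt = doctype.toLowerCase();
  if (dt.includes("xhtml 1.0 strict") || dt.includes("xhtml 1.1")) {
    return "standards"; // strict XHTML opts in to the new engine
  }
  // HTML Transitional, or anything else: fall back to the IE7 engine.
  return "ie7";
}
```

The point is that the trigger is something authors already write, so nobody has to hand-edit pages just to opt in.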

Edit: I wrote this post before Microsoft changed their minds. You read my mind, didn’t you, Microsoft? But couldn’t you wait until the post went live?