Posts Tagged ‘safari’

Nasty Safari bug not fixed since December :(

March 26, 2009

A rotten little apple by Ashley Harding

Apple has had a nasty Safari bug since December which breaks SmugMug, Facebook, Gmail, and lots of US banks.

3 months later, it’s still not fixed. Your only option is to use Firefox if you’re affected.

Apple’s known about the problem since December, and has filed lots of internal bug reports on the issue (30+, last I heard). (For my Apple readers, here’s our bug on the subject and the one it was marked as a duplicate of.)

I’ve done everything I know how to do to get this resolved – Apple employees have been working internally on our behalf, we have an AppleCare Enterprise Support case open (#332101), and I tried to open an ADC Premier case (it was denied because they don’t “provide code-level support for content creation issues,” whatever that means). Still no luck.

Apparently this is fixed in Mac OS X 10.5.7. We have the latest seed, so we’re going to find out, but 10.5.7 is likely a month or more away from shipping, so expect this stuff to be broken at least until then. Use Firefox.

Safari happens to be my favorite browser, so this is especially disheartening. The good news is you may not be affected. Not everyone running 10.5.6 with Safari is, for some reason, but lots of you are. You’ll know you are if you see SmugMug galleries which appear to be empty (but aren’t) or see ugly white pages with undecipherable error messages. For that, I apologize – I really wish we could help but we can’t. You can help yourself, though – use Firefox.

Put another way, Eric Schmidt, Google’s CEO, sits on Apple’s board of directors. Gmail has also been broken for 3 months. Apparently he’s powerless too. 😦

Nasty Bug: Safari doesn't cache stuff.

April 4, 2008

Strolling - Nairobi State Park by Simon Barnes

I swear I’m not making this up.

I couldn’t believe my eyes when I found it. Safari is one of our favorite browsers, and we love their work on standards compliance and speed, particularly JavaScript, but this particular bug is really driving us crazy. I’ve logged it with Apple (#5786274), and a fix is promised, but in case you’re getting hit with this and are as baffled as I was, here are the details:

  • If your computer has less than 1GB of RAM, Safari fails to cache items larger than 104,857 bytes.
  • If your computer has more than 1GB of RAM, Safari fails to cache items larger than 209,715 bytes.
  • JPEGs, at least, are temporarily cached in RAM. Whew. But upon browser restart, you’ll see they didn’t make it to the disk cache, so you have to get them again.
  • Other objects, like SWFs or videos, though, don’t even make it to the RAM cache, let alone disk. Load the same SWF back-to-back, and you’ve just transferred the bytes twice. Ugh.

Very easy to reproduce yourself from the comfort of your own home, so go for it. Just fire up HTTP Scoop or Wireshark or tail your server’s HTTP logs and start hitting stuff. Marvel at the # of excess bytes transferred across the wire that you didn’t need. 😦
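If you’d rather reproduce it against a server you control, a throwaway test server along these lines will do the trick – this is a sketch of my own (the paths, port, and header values are illustrative, not our production setup). It serves payloads just under and just over the reported limits with headers that should make them cacheable, so any re-fetch shows up on the console:

```python
# Throwaway server for reproducing the Safari cache bug: serves payloads
# just under and just over the reported cache limits, with headers that
# should make them cacheable. Watch the console -- a second request for
# the same path after a browser restart means the item wasn't disk-cached.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Byte limits reported above (machines with <1GB and >1GB of RAM).
LIMIT_SMALL_RAM = 104_857
LIMIT_LARGE_RAM = 209_715

def make_payloads(limit):
    """Return (just_under, just_over) payloads straddling a cache limit."""
    return b"x" * limit, b"x" * (limit + 1)

PATHS = {}
for name, limit in (("small", LIMIT_SMALL_RAM), ("large", LIMIT_LARGE_RAM)):
    under, over = make_payloads(limit)
    PATHS[f"/{name}-under"] = under
    PATHS[f"/{name}-over"] = over

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = PATHS.get(self.path)
        if body is None:
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        # Headers that should make any well-behaved browser cache this:
        self.send_header("Cache-Control", "public, max-age=31536000")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), Handler).serve_forever()
```

Point Safari at /small-under and /small-over (or the -large variants on a bigger machine), restart the browser, and load them again – if the bug bites you, the “over” payload gets fetched twice.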

Here are a couple of test URLs so you can see for yourself:

As a self-professed Apple fanboy, I can’t wait for a fix. In the meantime, we’ve had to jump through all sorts of hoops to ‘dumb down’ some of our most exciting new features. 😦

UPDATE: Yes, I’ve tried with every Cache-Control and Expires header known to man. No, it doesn’t make a difference. Try it yourself.
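For the curious, “every Cache-Control and Expires header known to man” means response headers along these lines (values illustrative) – any one of which should be enough to make an object cacheable in a well-behaved browser:

```http
Cache-Control: public, max-age=31536000
Expires: Sat, 04 Apr 2009 00:00:00 GMT
Last-Modified: Fri, 04 Apr 2008 00:00:00 GMT
ETag: "abc123"
```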

Thoughts on the new IE compatibility switch

January 23, 2008

Over on IEBlog and A List Apart, they detail a new flag for the upcoming IE8 that would enable you to “lock” the browser down to older versions should you be expecting older broken behavior from IE6 or IE7.
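For context, the flag those posts describe is an X-UA-Compatible hint: a site opts into legacy rendering with an HTTP header or a meta tag along these lines (there’s also an “IE=edge” value for always using the newest engine):

```html
<!-- Render this page with IE7's engine, even in IE8 and later -->
<meta http-equiv="X-UA-Compatible" content="IE=7">
```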

This is a bad idea. The Safari team has a great write-up on why they think so, and I agree with them, but I also have an additional take:

Pages and sites that are likely to care about this flag are poorly written and poorly maintained. Microsoft created this problem itself by letting IE6 sit idle for more than half a decade, and now it has to deal with the fallout. Instead of letting site owners flag their sites as broken (which is effectively what this switch does), why not finally push them to fix their sites and improve the browsing experience for everyone (not to mention improve the stability, speed, and maintainability of IE’s own codebase)?

If someone owned a car, but didn’t know how to drive it properly, would we bend the driving laws to let them on the road? Of course not. Some reasonable adherence to standards and moving things forward is the only thing keeping the web browser mess from descending into pure chaos.