IE7

Downloaded the IE7 beta today. I was interested in seeing whether it would render gtraffic.info the same way as Firefox, i.e. correctly. I am sad to say that it _still_ doesn’t fit the site into the page properly. They have decluttered the interface and added a completely pointless zoom in/out function for the web page.

Update:
I have been looking around and this page has a more comprehensive list of IE7’s rendering faults. Basically, Microsoft haven’t bothered their arse to fix _any_ of the existing HTML and CSS bugs. The only thing they have added to the rendering engine is alpha channel support for PNGs, which means the only reason for IE7’s existence is to try to stem the flood of people switching over to Firefox. Having had a chance to play around with IE7, I quite like it now, but the rendering engine is a pain in the arse.

Curse of the MiniDOM

I have spent a depressing evening trying to find out whether there is a mechanism, using minidom or PyXML, to load the BBC traffic TPEG data into a DOM and get it to resolve the node entity values. No luck. I can’t even use minidom to parse the entity file or DTD directly, as it rejects it. The only way I think I can get this working is to

– read the entity file line by line and generate name/value pairs from the definitions, using a regex to match valid entity definitions;
– (somehow) load the XML file into memory, perform a text replace on all entity references using the name/value dict, and _then_ pass the result to minidom and my parser code, which I have built line by excruciating line.

I don’t like this approach but I have set myself the goal of doing this so I will do it.
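
For the record, a rough sketch of what I have in mind. This assumes the entity definitions are the plain `<!ENTITY name "value">` kind, and the file names here are placeholders rather than the real feed paths:

```python
import re
from xml.dom import minidom

# Matches simple internal entity definitions: <!ENTITY name "value">
ENTITY_DEFN = re.compile(r'<!ENTITY\s+(\w+)\s+"([^"]*)"\s*>')

def load_entities(path):
    """Read the entity file line by line and build a name -> value dict."""
    entities = {}
    for line in open(path):
        for name, value in ENTITY_DEFN.findall(line):
            entities[name] = value
    return entities

def expand_entities(xml_text, entities):
    """Do a single naive text replace of &name; references before parsing."""
    def replace(match):
        # Leave anything we don't have a definition for
        # (&amp;, &lt; and friends) alone.
        return entities.get(match.group(1), match.group(0))
    return re.sub(r'&(\w+);', replace, xml_text)

# Placeholder file names, not the real BBC feed.
entities = load_entities('tpeg_entities.ent')
xml_text = open('tpeg_feed.xml').read()
doc = minidom.parseString(expand_entities(xml_text, entities))
```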

Latest News

I am _still_ working on a Python DOM parser for TPEG. I got so far and then realised that the minidom library I have been using does not load the enumerated types which specify a great deal of the fields (like ‘severity’, sheesh). If I understand it correctly (there is very little help around on this stuff), it is because they are not defined inline in the TPEG file itself but are reached through a convoluted route through the DTDs.

There is a solution to this, which is to parse them out of their source file first and then do a text replacement for the corresponding codes in the target TPEG file, but it is so horribly inelegant that I cannot bring myself to do it without at least trying to find an alternative.
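
One alternative I want to try first, though I have no idea yet whether it will actually work on the TPEG feed: minidom.parse() will take a SAX parser as its second argument, and the expat-based SAX reader has features for switching on external entity processing. Something along these lines, with a placeholder file name:

```python
import xml.sax
from xml.sax.handler import feature_external_ges, feature_external_pes
from xml.dom import minidom

# A SAX parser that is allowed to fetch and expand external general
# and parameter entities, i.e. the stuff defined off in the DTDs.
parser = xml.sax.make_parser()
parser.setFeature(feature_external_ges, True)
parser.setFeature(feature_external_pes, True)

# 'tpeg_feed.xml' is a placeholder, not the real BBC feed path.
doc = minidom.parse('tpeg_feed.xml', parser)
```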

I have had a suggestion about creating permalinks to region layers, which I believe is doable, so I will be looking into it.

Google have updated their Maps API and are apparently ditching ‘openInfoWindowXslt’, which my ‘createMarker’ function in gmaps.js uses. I need to look into this, as I assume the new API will be made official sometime soon.

Roadworks Up

Okay, the roadworks layer is back up again.

Still working on the TPEG parser. The more I look at it, the more I see I have to include. The version which generated the layers for the map is pretty simple in comparison, as I leave out a lot of data.