Reduce automobile fuel consumption by 25%.

The government should mandate that all new motor vehicles sold after a certain date have continuously visible fuel consumption gauges in the same way that they have speedometers and odometers. That is, an indicator of the following would always be visible in the dash:
  • instantaneous fuel consumption (right now)
  • fuel consumption tied to the trip counter (my mileage since I last hit reset)
  • overall fuel consumption tied to the odometer (over the life of the car)
  • tire pressure warnings about low or high pressure
It is estimated that improper tire inflation makes you use between 3% and 5% more fuel. Jacques Duval, a car expert in Québec, recently performed a media demonstration of instantaneous fuel consumption readouts as a way of making people aware of how their driving habits can cost an additional 20% to 30% in fuel. If drivers always see their mileage in real time, it will train them to adjust their habits. Just telling them is not enough; folks need the continuous reinforcement that comes from data visible in real time.

All modern automobiles use fuel injection under computer control. It is trivial to extract fuel consumption from any such system, and correlating it with odometer information to derive per-distance fuel consumption is equally trivial. The hard part is the ergonomics of displaying the data on the dash, and this was obvious from the reportage: the automobile used in the test, a modest Chevrolet, already had an onboard computer with such display options, but it took much pushing of buttons to get the appropriate fuel consumption figure onto the dash. A constant display would be far better.
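The computation the gauges need really is trivial; it can be sketched in a few lines. The function names, the idle cutoff, and the sample figures below are my own illustration, not any real engine computer's interface:

```python
# Sketch of deriving the three displays from data the engine computer
# already has: current fuel flow, current speed, and running totals.
# All names and numbers here are illustrative, not from a real ECU.

def instantaneous_l_per_100km(fuel_rate_l_per_h: float,
                              speed_km_per_h: float) -> float:
    """Fuel currently being injected, expressed per 100 km travelled."""
    if speed_km_per_h < 1.0:        # idling: a per-distance figure is meaningless
        return float("inf")
    return fuel_rate_l_per_h / speed_km_per_h * 100.0

def average_l_per_100km(litres_used: float, km_travelled: float) -> float:
    """Trip or lifetime average: total fuel over total distance."""
    return litres_used / km_travelled * 100.0

# e.g. burning 6 L/h while cruising at 100 km/h:
print(instantaneous_l_per_100km(6.0, 100.0))   # 6.0 L/100 km
# 45 L used over a 600 km trip since reset:
print(average_l_per_100km(45.0, 600.0))        # 7.5 L/100 km
```

The trip and lifetime displays are the same calculation fed by the trip counter and the odometer respectively.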

In any event, the cost of implementing this feature will likely be negligible for automobile manufacturers. Setting the mandate three years out allows the manufacturers to incorporate the new requirement into the next review of their product lines, further minimizing implementation cost.

Automated tire pressure monitoring systems are already present on some luxury models. Monitoring of tire pressure is a safety concern as well as an environmental one: cars with incorrect pressure can show poor manoeuvrability and traction on top of the fuel consumption penalty. Verification of tire pressure is a relatively time-consuming and oft-neglected chore. For those with poor mobility, such as the elderly, or those who will not normally perform such tasks (such as my wife, who does not want to touch the dirty wheels or crouch beside the car in the winter slush), automated monitoring is a boon. Admittedly, this requirement might add cost to automobiles. One hopes that as it becomes a mass market item, the additional cost would be minimal.

From Achilles Michaud's report, a Ford Focus produces 3.7 tonnes per year of CO2, while a Ford Explorer produces 7 tonnes. If we take an average vehicle as being 5 tonnes per year, then a twenty percent savings from these two measures would meet the 'one tonne challenge' all on its own. The goal of this suggestion is to ensure that drivers have real data on which to base continuous decisions, and to promote real progress towards meeting Kyoto targets in a sustainable fashion.
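The arithmetic is worth making explicit (the per-vehicle figures are from the report; the 5-tonne average is rounded, as in the text above):

```python
# Checking the 'one tonne challenge' arithmetic.
focus, explorer = 3.7, 7.0          # tonnes CO2 per year, from the report
average = (focus + explorer) / 2    # 5.35, rounded to 5 in the text
savings = 5.0 * 0.20                # 20% of an average vehicle's output
print(savings)                      # 1.0 tonne: the challenge met outright
```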

The follow-up to Kyoto is coming to Montreal:

The Jacques Duval & Achilles Michaud media report:

The one tonne challenge:


Bandwidth is a utility.

Telephone used to be analog and application specific. Television used to be analog and application specific.
By application specific, I mean that your phone never used to serve web pages, and you never expected to order pizza over your cable television. Both technologies are now fully digital. Beyond digital, both are now TCP/IP based (they use packets and the protocols that underlie the internet). Once things are digital, they are no longer application specific, and it makes no sense to have a network for a single application (phone is a single application, as is cable television).

Technology is improving at a rapid rate, and in the next few years the phone companies will be rolling out fibre to the home. They will do that because cable companies have a much higher bandwidth medium (coaxial cable) to work with, and can offer to replace phone service more cheaply over their cables. Traditional copper phone wiring cannot carry nearly as much signal as coax, so the phone companies have to roll out fibre or risk being driven into oblivion by everyone cancelling their phone service and using only cable (because phone service will be much cheaper there).

But the fact remains: with Internet technologies, you do not need multiple networks connected to your house. Given a choice, why would you have multiple networks running to your house? Not only is there no reason, it is quite expensive. Over the next twenty or thirty years, places are going to lose networks. In some areas, enough people will switch away from the phone company (or the cable company, once phone companies start offering cable TV over fibre) that it will become very expensive for the phone company to provide service there. The overlap of the cable and phone networks will gradually decrease, and eventually we will settle into comfortable oligopolies, where single companies own the only network covering large areas.

That future is almost certain to occur in the US, where internet service has been completely deregulated. Not only will you have only a single network provider, but that provider, without local competition, will be quite expensive. The natural number of physical network providers in any given area is one. Internet bandwidth is a lot like electricity that way: sure, you can run multiple sets of power lines, but that is going to be hellishly expensive unless your clients are huddled together, preferably near the point of generation. So we should look at the concept of cities providing bandwidth, much like most cities provide water, or local authorities provide power (Hydro-Québec, the Tennessee Valley Authority, etc.). That is probably the route that makes the most sense over the long term, the alternative being a Bell-style regulated utility.

OK, so basic economics points to losing expensive extra networks. A basic thing that an oligopoly of private networks will want to do is packet preferencing and packet filtering. Today, I run a mail server out of my house. Most people cannot do that because their ISP agreements prevent them from running 'servers.' Anyone using internet phone service is very likely running a server, and very likely violating their ISP's terms of service. What the ISPs want to do is sell you their own phone service, their own email service, their own web-hosting service. What people do not realize is that you can do all of that in your home, for nothing, as long as networks do not do packet preferencing.

Today it takes some geek knowledge, but there is no cost involved. The major ISPs have already killed competitive email solutions by blocking port 25 (the mail traffic port), and they are fighting providers like Skype to keep their customers from getting voice communications from anyone else. This will only continue if the networks are unregulated and permitted to keep filtering and prioritising according to their corporate interests. Dropping voice traffic at the gateway is in the corporate interest of your cable company: it reduces load on their internet link and enables better service for the cable company's own clients. But it is deeply wrong. It is as if Westinghouse were your power company and permitted only Westinghouse appliances to be connected to the power. GE would be out, Frigidaire too.

Clearly, what you want is a vendor-neutral internet, where you can buy services from whoever you want, and even build your own if you are the DIY type. No private company will want to give you that freedom unless there is coercion of some kind. Government regulation could do it, but competition from a city- or area-run non-profit with the public's interest at heart could probably force the for-profit corporations to be civilized.

The real question is how the most people can get provisioned with vendor-neutral bandwidth at the lowest cost to the consumer and the economy. I very much doubt that will happen in an unregulated economy, because the economics push towards a natural monopoly, and a for-profit monopoly does not drive efficiency.

Ordinary people should want standardized WLANs

Wireless communications could be a lot better if they were standardized. A large market (practically: the US, Europe, Japan, or China) could make it happen by passing a law saying 'all consumer electronic devices must communicate with the IEEE 802.11 WLAN protocol by the year 2015.'

Sounds like gobbledygook, right? OK, it needs some de-geeking...
What's a WLAN? It stands for wireless LAN, and the 802.11 standard is why you can use a wireless base station from one company with a computer from any other company. A protocol is the language that is used to trade information between two computers. Today, a remote control talks to a TV using a signalling system made by the manufacturer (or some sub-contractor) to communicate with their own controller (or one made by a sub-contractor). No other device can trade information with your TV.

A cordless phone talks to its base station using another private signalling system; wireless weather stations and many other thingums use their own private systems. There are government bureaucracies in most countries (the FCC in the US, the CRTC in Canada) which say to manufacturers: you can only transmit at such and such a frequency, at such and such a power level. The frequencies are treated as a kind of real estate. There was a very good reason for this sort of management through most of the twentieth century: people were making radios that talked over each other and interfered with each other. RADAR works on the same principle as broadcasting; if there were no management, you would have snow on your TV screen every time the local airport's RADAR swept in your direction. If you think these sorts of problems are imaginary, read this brief account of keyless car locks going nuts near military bases. These problems result from folks thinking that low power, short range communications should not interfere with anything. What nobody counted on was that RADARs and jamming equipment are, by their very nature, very powerful transmitters, and can overpower low power transmitters that are very far away.
This sort of problem happens because the devices in question are very simple. When computers communicate, there is a famous (OK, famous among geeks) seven layer model ( ). There is a physical link layer, which could run over radio waves, wires, glass fibre, or whatever; standard electronics for the medium takes care of actually sending and receiving signals. Another basic concept that shows up at this level is 'packets.' Packets are messages limited in size by the medium being used to send them. Packets are usually quite small: they have a clear radio signature at the beginning, a clear radio signature at the end, and the middle follows some rules about how information is stored in it. Say the first part of the packet states how long it is, and at the end there is often a checksum, which is a simple check to see if the packet got clobbered in transit. If any of the rules are broken (the start tag, the end tag, the size making sense against the start and end, the checksum matching what was received), then the physical link layer will normally discard the packet as invalid.
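Those framing rules can be made concrete with a toy packet format. The start/end markers, length byte, and checksum below are invented for illustration; real link layers, 802.11 included, are far more elaborate, but the discard-on-any-broken-rule behaviour is the same:

```python
# Toy packet format: start marker, length byte, payload, checksum, end
# marker. Anything that breaks a rule is discarded, exactly as described.

START, END = 0x7E, 0x7F              # made-up 'radio signatures'

def checksum(payload: bytes) -> int:
    return sum(payload) % 256

def make_packet(payload: bytes) -> bytes:
    return bytes([START, len(payload)]) + payload + bytes([checksum(payload), END])

def parse_packet(frame: bytes):
    """Return the payload, or None if any framing rule is broken."""
    if len(frame) < 4 or frame[0] != START or frame[-1] != END:
        return None                       # bad start/end signature
    length = frame[1]
    if len(frame) != length + 4:          # size must match the length field
        return None
    payload = frame[2:2 + length]
    if frame[-2] != checksum(payload):    # clobbered in transit
        return None
    return payload

pkt = make_packet(b"unlock")
assert parse_packet(pkt) == b"unlock"
assert parse_packet(b"\x13\x37random noise") is None   # garbage is discarded
```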

Sorry... what? ... It means that if someone points a RADAR at your car, then with early keyless entry systems, the car just reacts to a fairly simple signal. It will 'see' almost all the traffic, and eventually, just through dumb luck, random noise will be 'decoded' by the receiver, and the trunk will open because of the RADAR. Now these things are slowly getting smarter, but they are really just re-inventing a wheel that already exists: wireless LANs. A wireless LAN, on the other hand, will try to put any signal it receives through the packetization engine, and throw out almost all the random stuff as garbage.

The stuff that makes it through the packet engine will then have to go through the security mechanisms of a wireless LAN. There are lots of mechanisms, but basically the idea is that both the remote control and the reception unit in the car 'know' a 'password.' They use the password to make the message unreadable; someone who knows the password can make it readable again. So once wireless LAN hardware gets a packet, it will try to make the message readable again based on the shared secret. If that doesn't work, again, the packet is thrown out. The resistance to interference comes not from having a really good radio, but from how the signal is structured, making it nearly impossible for random interference to be understood as real commands from a remote that a device should listen to. The shared secret makes it reasonably hard for your neighbour to change your TV's station.
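The shared-secret step can be sketched too. Real Wi-Fi security (WPA and friends) differs in detail; the sketch below uses a message authentication code instead, but the principle it shows is the same one described above: without the password there is no valid tag, so the packet gets thrown out.

```python
# Sketch of the shared-secret check. Both ends know SECRET; a frame
# carries a tag computed from the message and the secret. A frame whose
# tag does not check out (random noise, or a sender without the
# password) is discarded. The secret itself is a made-up example.
import hashlib
import hmac

SECRET = b"the-password-both-ends-know"

def tag(message: bytes, key: bytes = SECRET) -> bytes:
    return hmac.new(key, message, hashlib.sha256).digest()

def send(message: bytes) -> bytes:
    return message + tag(message)

def receive(frame: bytes):
    message, received_tag = frame[:-32], frame[-32:]
    if not hmac.compare_digest(received_tag, tag(message)):
        return None                 # wrong or missing password: discard
    return message

assert receive(send(b"open trunk")) == b"open trunk"
assert receive(b"radar noise" + b"\x00" * 32) is None
```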

OK, but what about interference? Early cordless phones would buzz when you used them near appliances. They used a single frequency, like a radio or a television, and if an appliance made noise on that frequency, you would get buzz. After a while, higher frequencies like 900 MHz came along, and phones moved to 'digital signals' (which means they make the audio into packets... but each maker does it their own way), which made things a little better, but there were still problems with interference and crosstalk. So then 'Digital Spread Spectrum' came out. What's that? Well, it means that instead of using one frequency, the cordless phone and base station listen for other radios on a bunch of frequencies, and avoid interference by using (aka "hopping" to) the ones with the least noise.
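At heart, the hopping decision is just 'pick the quietest channel.' A toy sketch, with made-up channel numbers and noise figures:

```python
# Toy version of the channel-avoidance idea: measure the noise on each
# allowed channel and hop to the quietest one. Figures are invented.

def quietest_channel(noise_by_channel: dict) -> int:
    """Return the channel with the least measured noise."""
    return min(noise_by_channel, key=noise_by_channel.get)

# Channel 6 is being hammered (say, by that airport RADAR):
noise = {1: 0.2, 6: 9.5, 11: 0.4}
print(quietest_channel(noise))   # 1
```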

So when a military RADAR sweeps past your car, if the car were using wireless LAN technology, it would start discarding 99% of the packets coming in on the frequency it was using, assume others were using that frequency, and the remote and car would switch to a less crowded one instead.

A radio that has things like frequency hopping, packets, and security is what you need when a lot of devices share the same radio space without causing gaps in conversations, or trunks of cars opening at random intervals. Common devices can share less radio space, and we can reclaim the frequencies now allocated to them for other uses. Another benefit of this technology is that WLAN radios adjust their power levels to send at only the minimum power required to communicate. Those who worry, rightly or wrongly, about long term exposure to radio waves can take comfort that smart radios will use less power, and the total amount of radio transmission will be reduced by all devices being able to share a single base station.

Today, the hardware for a WLAN interface costs on the order of $20. This is a lot for some forms of consumer electronics (think DVD remote control), but standardization will also drive down costs, since the market will be vast, no one will have to develop company-specific radio hardware, and all devices will be able to listen and talk to each other.

So it won't cost much, and it will allow us to use radio frequencies for other purposes, but what is the win for the consumer? Well, using the same system as computers means you have a gateway to computers. Anything you can send over this short range radio can be sent to a common base station, and then sent anywhere over the internet.

OK... instead of a cordless phone with a base station for that brand of phone, all cordless phones (like this one...) will work with any wireless base station. They use Voice over Internet Protocol, and soft PBXs (like this one: ) implemented on computers give any home a complete industrial strength phone system. No such thing as a busy tone: someone phones your home to talk to your teenager, your spouse is conversing with his/her mother, and you can still receive calls at the same time. There would be a few tests to pass before the phone actually rings in the house. Phone numbers of people you know would be let directly through; others would have to answer some questions first. Gone are the days of heat pump salesmen interrupting your dinner. Cell phones are basically history: people will have bandwidth available everywhere, and calls will be free.
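The screening tests could be as simple as this sketch (the phone numbers and the challenge mechanism are hypothetical, and a real soft PBX would express this as dial-plan rules):

```python
# Toy call screening: known numbers ring straight through; unknown
# callers must first pass a challenge (answer a question, press a key).

KNOWN = {"514-555-0101", "418-555-0199"}   # made-up numbers

def route_call(caller_id: str, passed_challenge: bool = False) -> str:
    if caller_id in KNOWN:
        return "ring"
    return "ring" if passed_challenge else "challenge"

assert route_call("514-555-0101") == "ring"          # family: straight through
assert route_call("800-555-0000") == "challenge"     # heat pump salesman
```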

If the car and the house speak to each other, you can tell your home entertainment system to send the new Disney movie to the car so the kids can watch it on the way to grandma's, without having to carry anything. You can use the TV remote to start the car so it is warm when you get there, and find out whether you need to get gas and whether the tire pressure is low.

It is hard to come up with good examples of how this will change everything, but it will, in some way that we cannot foresee right now. It would be a multi-billion dollar win for society, but it needs a network effect to get started.

This is all easy stuff to do at a technical level. It isn't hard; it just needs people in different industries to talk to each other. If you take existing wireless LAN technologies, layer web servers on top of them, and use XML for communications, then all of these things are just a matter of agreeing on details. Proposing a law or a regulatory requirement might give just the push we need.


Cars Should Tell You What is Wrong

I'm kind of wondering about car maintenance because I drive a nine year old car with 211,000 km on the odo, which runs fairly well and does not have too much rust (this is Quebec; they salt the roads, and the cars rust rather quickly), but it is kind of disheartening. The Check Engine light comes on whenever it is damp, and goes out after a couple of days. When the car was young, I used to take it to the dealer, and they would kind of shrug. The computer would be telling them to replace some $500 part, and they knew it was kind of bogus. After a few years (it happened once every six months or so at the time), they figured out it was the ignition cables, which had issues when it was wet.

Once the alternator died with no warning. Another time, a radiator hose broke. I didn't much like being stuck by the highway with the wife, the children, and the domestic animals. You know, back in the seventies and eighties this was kind of normal stuff, and you just dealt with it, and I still do. But cars have become a sort of utility, especially in single car families.

The automotive industry today tests for durability, but it has other concerns too, like reducing weight to improve fuel consumption and ensuring that things crumple optimally in an accident. The durability tests for components, in my experience, mean that at around 180,000 km stuff starts to break, regardless of make or model. This will only get worse, as car components are increasingly cost driven and the same contractors sell parts to multiple manufacturers. Brand name is less and less an indicator of component quality.

This tells me that 180,000 km is about the limit of durability one can expect, keeping cost and other factors in mind. For the car maker, this is past the reasonable point of testing, and they probably want you to buy a new car. It used to be that a reasonable person with a screwdriver, a feeler gauge, and a few wrenches could maintain a car literally forever. Those days are long gone; cars have become far more complicated. For owners, there is the obvious cost reason to keep the car past that point, but there is also the planet to consider. Sure, cars can be recycled, but it is probably much better to just maintain the car you have (assuming it isn't a gas guzzler). If you want to keep a car beyond that point, the most important question is: how easy is the car to diagnose and repair? That tells you how much time and money you will spend on maintenance, which is the only major cost once the car is paid off. Paradoxically, it has become less practical over the past few decades to keep cars as long as they will last, because QA testing and engineering have improved: whereas before a component would be over-engineered and last forever, now it is made to be 'just right' (not too heavy, not too expensive, not too... durable).

This should not be that hard. If you design the vehicle with enough sensors and enough software, it would very likely do a great job of diagnosing itself. Really good sensors and diagnostics are a sound economic investment for the owner who plans to keep the car for a very long time, because then a repair for an obscure electrical problem is one hour's labour instead of six. Maintaining an older car could be relatively easy.
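As a sketch of what 'diagnosing itself' could mean: match sensor readings against per-component fault signatures, instead of lighting one generic Check Engine lamp. The sensor names, thresholds, and fault rules below are all invented for illustration; a real system would be built from the manufacturer's own failure data.

```python
# Sketch: enough sensors plus simple rules can point at the failing
# component. Every name and threshold here is made up for illustration.

RULES = [
    ("alternator", lambda s: s["battery_volts"] < 13.0 and s["rpm"] > 1000),
    ("coolant hose / radiator", lambda s: s["coolant_level"] < 0.5),
    ("ignition cables", lambda s: s["misfire_rate"] > 0.05 and s["humidity"] > 0.8),
]

def diagnose(sensors: dict) -> list:
    """Return the components whose fault signature matches the readings."""
    return [name for name, matches in RULES if matches(sensors)]

# A damp morning with a weak charge and misfires:
readings = {"battery_volts": 12.1, "rpm": 2200,
            "coolant_level": 0.9, "misfire_rate": 0.08, "humidity": 0.95}
print(diagnose(readings))   # ['alternator', 'ignition cables']
```

The point is not the rules themselves but that naming the suspect component turns a six-hour hunt into a one-hour repair.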

Today, and I've seen it time and again, people do not get electrical problems fixed simply because the labour would be too expensive. This should be much simpler. The problem is how to get the motivations in place so that there is an economic incentive for car makers to make older cars easier to maintain.