www.satn.org
Comments from Frankston, Reed, and Friends
Saturday, February 09, 2002

BobF at 5:56 PM [url]: Rainmaking?

This is a copy of a letter I sent to David Farber's "Interesting People". I'm holding it to see if it gets republished or not. It is based on his posting of a Cringely story. I also sent a longer follow-up letter after I got Mark Laubach's book Breaking the Access Barrier: Delivering Internet Connections over Cable (by Mark Laubach, Stephen Dukes, and David Farber) that went into a little more detail, since I noticed that I knew most of the authors of the book as well as the editors of the series. The book is very good and my quibble is not with how to work around the problems of using the cable system as a transport -- it's just that this is a temporary kludge, brilliant though it may be.

The problem with Rainmaker and other technologies is that they are only valuable when they address the real limits. Running high speed on a segment of a network isn't interesting without addressing the architecture of the cable distribution system and all the devices connecting the segments. The real limiting factor is the lack of (marketplace) incentives to use the technology as part of creating innovative solutions. It's like deploying fire hoses that can carry ten times the amount of water that is available from the pipes and then wondering why one can't get 100 times the amount of water by just deploying 10 such hoses. As the HDTV and ITV failures (or simple lack of deployment) have shown, there doesn't seem to be enough additional revenue in delivering more. On the other hand, if the architecture of the network gave the users the capacity to create their own services, then we'd have something very interesting.

But we already have the 10x improvements in technology, if not the 1,000,000x. The availability of connectivity is restrained by legacy business models, not by technology. More on this in the connectivity essay.

BobF at 12:19 PM [url]: Letter to the editor of ISP-Planet

Just a quick note -- I responded to an article in ISP-Planet about issues related to connectivity. They have posted it as a letter to the editor.

Thursday, February 07, 2002

BobF at 1:41 PM [url]: For those who like to see pictures on the Internet

This is the mini book PC sitting in my kitchen. No, that's not a reflection on the screen -- just the desktop background using another picture of the computer close up. It's sitting on top of a small auxiliary refrigerator which doesn't really have space for the screen and the keyboard and the mouse, let alone a computer. But it is sort of workable for now.

One observation is that the wires are a real problem. As much as I like the choices of connectors, a single gigabit Ethernet should work for all the external streams. This includes that huge video connector. And if a gigabit isn't enough then we can focus the effort on increasing the capacity of that one medium. A lot better than all those twisty wires and the clunky video cable that is hard to bend.
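To put a rough number on that single-cable idea, here is a quick back-of-envelope in Python for what an uncompressed desktop video stream would need. The display modes, color depth, and refresh rate are assumptions for illustration, not figures from the post, and a real link would presumably compress the stream anyway.

    # Back-of-envelope: could one gigabit Ethernet link replace the video cable?
    # The modes, color depth, and refresh rate below are illustrative guesses,
    # not figures from the post.
    GIGABIT = 1_000_000_000  # bits per second

    def raw_video_bits_per_second(width, height, bits_per_pixel=24, refresh_hz=60):
        """Bandwidth of an uncompressed desktop video stream."""
        return width * height * bits_per_pixel * refresh_hz

    for width, height in [(800, 600), (1024, 768), (1280, 1024)]:
        bps = raw_video_bits_per_second(width, height)
        print(f"{width}x{height}: {bps / 1e9:.2f} Gbit/s uncompressed "
              f"({bps / GIGABIT:.0%} of gigabit Ethernet)")

At typical desktop modes of the time an uncompressed stream sits right around the gigabit mark, which is presumably why the post hedges with "if a gigabit isn't enough."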
BobF at 11:59 AM [url]: Mini Book PC

[Note -- as much as I apologize in advance for typos and spelling errors, I usually impose upon Dan (Bricklin of course, after all, whose blog is this?) to proofread. But I mentioned this on David Farber's IP list and want to respond to the questions.]

I'm having fun playing with tiny machines which, of course, means I have even more distractions to keep me from writing up comments on my past distractions. The particular machine I have is the "Mini Book PC". I bought mine through TigerDirect but they no longer carry it and it took six weeks to get it. But I've looked around and found it on a number of other sites. It is manufactured by Saint Song in Taiwan. While I got mine with 128MB (I've since replaced that with a 256MB SODIMM), a 10GB drive, and a 1GHz Celeron for $500, that may have been a low price because of early lack of demand, but I think there is a very interesting niche for this kind of machine. If you use Google you'll find a number of places to buy it. You can use a Pentium III instead of the Celeron (though I can't say exactly which versions offhand) and add 20GB or more since it uses standard notebook disk drives. The unit itself is about 5" by 6" by 2". There is also a smaller version! That's the Espresso PC (the one I have is called the Cappuccino CG1); it doesn't have the internal CD (or DVD -- it's a laptop-type docking bay). I'm excited because it is small and fairly quiet but has a full array of ports and connectors: VGA, TV out (s-video), USB, Firewire, Ethernet, Modem, Keyboard, Mouse, Serial, IR, and even a volume control so you can use it as a portable CD player not much bigger than a standard one (OK, thicker, and at 1.9 pounds it is heavy for jogging). I'm running XP on mine though, obviously, you can run Unix (Linux for those who don't generalize) and other operating systems.

I've long been arguing that the PC is more than just a mainframe on a desktop and is the building block for all sorts of other devices. While still too expensive for most people to throw them around casually, it is a hint of what is coming very soon. IBM recently announced a small machine but they think of it as a research project, and IBM said "MetaPad is a radical experiment in form factor and is furthering our understanding of how humans can better interact with their information". I find this very strange since their machine is only a little smaller than the ones already shipping and there are so many obvious uses. Yet IBM seems to think of it only as a traditional PC.

In fact, it challenges the notion of the PC created in the image of the old time sharing systems. While I come from that environment and feel comfortable with the concepts, I also find it frustrating as I try to deploy the machines in new environments and applications where there isn't the right fit. This is a challenge to both the Microsoft world and the Unix/Linux world. Microsoft has the added dilemma of trying to maintain a high price computed on a per-employee basis. But this discourages the deployment of systems around the home. WinCE is intended for this niche but is significantly less capable. If we add the C# and other CLR (Common Language Runtime) aspects to this the story becomes even more interesting. I'm still learning more about the CLR world. It's more often thought of as dotNet, but the web services spin is fraught with many problems and the basic architecture is far more interesting as a processor- and even OS-independent environment that builds on many ideas, including those in Java, and goes further. All of these are intriguing aspects which I hope to write about if I can tear myself away from the fun of discovery. The Linux world has its own challenges since so much attention is devoted to Microsoft-envy rather than rethinking what an operating system should be. Again, lots of interesting things to write about.

When I was championing home networking at Microsoft I anticipated these kinds of systems. In fact, if we go back to the 1970's, it was obvious we'd be getting small, powerful computing engines we could deploy.
These are still far more powerful than we really need, but for exploring ideas it is often better to have too much power so we can concentrate on the applications. The danger is becoming so addicted to the big iron (even in a small form factor) that we fall victim to the complexity and overhead. Ideally anything we can do in one of these can be done in a single chip once we have a better idea of what we need. But going to the single chip prematurely is like premature optimization -- it is too easy to become obsessed with saving bits rather than solving the problem. Yeah, I know, I did that with VisiCalc and sometimes you just get lucky. But even then, it was only after a decade of using the full power of mainframes to learn.

Tuesday, February 05, 2002

DanB at 1:09 PM [url]: Visit to CIMIT

In a posting on my personal weblog yesterday, February 4, 2002, I describe a visit to the Center for Integration of Medicine and Innovative Technology in Cambridge, Massachusetts. The posting has several pictures and describes some of the projects. What I didn't mention is that as part of one project they are working on software implementations of Wide Band Radio that let them work with all sorts of different sensors in the operating room transmitting with all sorts of protocols. They're also figuring out how to run many wireless devices and protocols at once in close proximity, working through in practice the problems and solutions David Reed mentioned here on SATN about spectrum. I think it's related to the SpectrumWare project at MIT (John Guttag is involved in both). If you are interested in "software radio", check out the Software Radio Resources Page and Guttag's "Communications Chameleons" article from the August 1999 Scientific American.
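To give a feel for what "software radio" means here -- and only as a toy sketch, not CIMIT's or SpectrumWare's actual code -- the Python below digitizes one wideband stream and then "tunes" to different transmitters entirely in software. The sample rate, carrier frequencies, and crude filter are made-up illustration values.

    # Toy sketch of the software-radio idea: one wideband sampled front end,
    # with "tuning" done entirely in code.
    import numpy as np

    fs = 100_000.0            # assumed sample rate of the shared front end (Hz)
    t = np.arange(5000) / fs  # 50 ms of samples

    # Two sensors transmitting different tones on different carriers.
    sensor_a = np.sin(2 * np.pi * 300.0 * t) * np.cos(2 * np.pi * 12_000.0 * t)
    sensor_b = np.sin(2 * np.pi * 700.0 * t) * np.cos(2 * np.pi * 30_000.0 * t)
    received = sensor_a + sensor_b  # what the one shared receiver digitizes

    def tune(samples, carrier_hz, cutoff_hz=2_000.0):
        """Select one channel in software: mix it down to baseband,
        then low-pass it with a simple moving average."""
        baseband = samples * np.exp(-2j * np.pi * carrier_hz * t)
        taps = int(fs / cutoff_hz)
        return np.convolve(baseband, np.ones(taps) / taps, mode="same")

    # The same stored samples yield either sensor, just by changing a number.
    for name, carrier, expected in [("A", 12_000.0, 300), ("B", 30_000.0, 700)]:
        channel = tune(received, carrier).real
        peak_hz = np.argmax(np.abs(np.fft.rfft(channel))) * fs / len(t)
        print(f"channel {name}: recovered tone ~{peak_hz:.0f} Hz (expected {expected})")

The point of the sketch is that selecting a channel is just arithmetic on the same stored samples, so supporting another sensor or protocol becomes a software change rather than another piece of radio hardware.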
Sunday, February 03, 2002

DanB at 11:38 AM [url]: Some thoughts on Enron

As a person who does a lot of work with usability, I found the Butterfly Ballot to be a very interesting topic, and found ways to tie it to the computer field. Similarly, as an MBA, I find the Enron situation fascinating. I get to use my finance and accounting training to understand something important related to our general world. I've been following the story closely in the New York Times and the Wall Street Journal. In last Friday's WSJ, in a story starting on page 1, it says: "They [an Enron team] figured that for Enron to grow quickly, it couldn't be weighed down with debt. Too much debt would threaten the company's credit rating and make its financing costs higher." So they structured things so the debt wasn't reported.

Now, why does too much debt affect the credit rating? Because with a volatile business like Enron's, more debt brings on more risk. By hiding the debt they weren't getting rid of it. The debt was still "there" and so was the risk. In fact, because they hid it so that their stock price was artificially high and the stock price helped them secure the debt, they were actually adding to the risk. (A small numeric sketch after this post makes the point concrete.) If they didn't understand this, they sound like an infant who thinks that if they can't see something it disappears and ceases to exist. We play "peek-a-boo" to teach infants that things do come back. If the Enron people thought that the risk didn't need to be taken into account, they shouldn't have been investing money.

This brings up an issue. A major problem with Enron was that people outside the company did not know what was going on. The accounting reports didn't tell enough. What it comes down to is that to really know what's going on, you need access to everything so you can make up your own mind. This idea of showing as much as possible is called "transparency" when we talk about globalization of finance. Countries and companies that don't tell all you need to know aren't transparent enough, and can't be trusted. As we see, people are more likely to do things you wouldn't approve of, and less likely to have mistakes caught, if they can hide what they did.

The software connection? Transparency is important here. If you want to trust code, you really need to be able to check out everything. The best way is by being able to examine the source code. Bugs and Trojan Horses are harder to find when you can't check the source. So, you could say that Enron points to the need for Readable Source. While Open Source is one type of Readable Source, it is not the only way. Some of the remedies mentioned in the Microsoft case, which maintain Microsoft's ownership and control of what's in Windows, speak to Readable Source and address other methods. Sure, it's nice to have the simplicity of normal accounting statements and API documentation, but there are times when you want it all so you can dig deeper.
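As a purely hypothetical illustration of the hidden-debt point above (these numbers are invented for the example and are not Enron's actual figures), here is how moving debt off the balance sheet changes what outsiders see without changing what is owed:

    # Hypothetical figures (in $ millions), not Enron's actual numbers.
    def debt_to_equity(debt, equity):
        """A simple leverage ratio that lenders and rating agencies watch."""
        return debt / equity

    equity = 10_000      # shareholders' equity
    total_debt = 12_000  # everything actually borrowed
    hidden = 8_000       # portion parked in off-balance-sheet partnerships

    reported = debt_to_equity(total_debt - hidden, equity)
    actual = debt_to_equity(total_debt, equity)

    print(f"reported debt/equity: {reported:.1f}")  # 0.4 -- what the statements showed
    print(f"actual debt/equity:   {actual:.1f}")    # 1.2 -- what was really owed
    # All 12,000 still has to be serviced, so the risk never went away;
    # it was just invisible to anyone reading only the published reports.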