The internet has been heavily disturbed this weekend by a worm that has been spreading via unsafe MS SQL Server machines all over the internet. It has generated enough traffic to shut down or block 5 of 13 root name servers according to some reports, and that certainly counts as hurricane strength as 'internet weather' goes. The slashdot thread on the issue is interesting, and in particular it is interesting to learn that the worm has been able to spread inside one (1) UDP packet exploiting a buffer overflow. That's 376 bytes of very malicious code! (analyzed here).
The title is supposedly a quote from William Gibson. The future I'm referring to is effortless universal machine translation.
I've asked before about machine translation: But Does It Work? and got some 'almost there' if not so nice-looking French/English going. Here, however, is a perfect translation success for this Spanish webpage. This translation isn't just a nice try, but a perfectly functional if not fully grammatical translation. It is easy to supply the corrections needed, and I don't speak a word of Spanish, so I'm relying completely on the translator.
At the 8-month mark for classy.dk postings the daily avg. has reached 0.82 postings per day (up from 0.75 at the six-month mark) indicating a very busy two months since then. At the same time we have passed 200 postings. Now I REALLY need to extend the offerings beyond weblogging.
I just remembered what Yahoo's recent acquisition of Inktomi reminds me of: it's like buying Western Union to compete with the recently founded Bell Company. No amount of marketing muscle or corporate power-dealing can undo the superiority of Google's search.
It is beautiful demagoguery. An opening section criticizes and relativizes the everyday Western European world - and the view it is built on - as "truth", that is, merely a worldview, a political choice of how to perceive reality. That is all fine in the hypercomplex society, where ideas about reality are situationally determined and no longer able to transcend that situational barrier and become a genuinely deep, shared reality. Unfortunately the relativizing lasts no longer than until section two, because the Zapatistas' revolutionary "truth" is not a "truth" but a bona fide truth - and furthermore, truth is dignity.
The article apparently does not believe there can be an inherent dignity in the Western truth, or in the Mexican government's for that matter.
[UPDATE 2007: How I managed in 2003 to overlook that the author of the Zapata text is Torkil Lauesen of the Blekingegade gang, I cannot explain - but it only reinforces the impression of the concept of 'counter-power']
Joy of joys: my Google PageRank which was previously a positively lame 4/10 has improved to a merely ridiculous 5/10. This is still a lot worse than your average 'best hit' page on Google, but it proves that there is value in noisemaking.
After a break of some years, I bought Information on Saturday to see how it looks these days. But I was badly disappointed when I reached the back-page editorial by David Rehling.
The editor-in-chief of Informeren, David Trads, had otherwise just attempted - also in an editorial - to set things straight a little by pointing out that there are not thousands of actual political prisoners in the USA (even though the proportion of people imprisoned there is frighteningly high), and that, despite the use of force and the unilateralism, there is no getting around the fact that the fundamental principles over there are freedom and so on.
The back-page editorial is indeed about David Trads' 'attack' on the left wing around Christmas, and a debate meeting held on that occasion.
However, it does not continue the debate, but returns undaunted to an army of old clichés, fossilized enemy images, and worn-out ideas.
The editorial is in places hysterically (involuntarily) funny. It naturally claims to take the debate seriously, but it is nonetheless all about how 'counter-power' can be 'established' against the sitting governments, and in an interview Claus Bryld is quoted as saying that 'it could go on for another 200-300 years'; he despairs that 'the Chinese are also jumping on the bandwagon', and the only possibility he sees for stopping 'the prevailing neoliberalism' is 'the destruction of nature'. A perspective Rehling just lets stand for a moment as a crushing closing remark.
The matchless historical ignorance of suggesting that 'we must expect a long period without political innovation', and of proposing a scale of 200-300 years, is without parallel, and primarily demonstrates that we must expect a very long period without innovation from Claus Bryld.
Reaching that far back in time takes us to societies before the steam engine, before most of the significant developments in European political philosophy, the American, French, and Russian revolutions, the emigration from Europe, the discovery of penicillin and the eradication of tuberculosis and cholera in Europe, the atomic bomb, the automobile, the world wars, Beethoven and Elvis Presley, and of course classy.dk.
It is thought-provoking that predictions of this kind from the left side of the political spectrum sound exactly like the somewhat earlier and already abandoned ideas of conservatives like Francis Fukuyama. The left wing has become thoroughly reactionary. If it weren't so much fun to write long passages characterizing that kind of nonsense as what it is, one would simply bury one's head in one's hands.
And as if that were not enough, this historical displacement activity is to be summed up with the observation that there is, of course, 'the destruction of nature' to look forward to as a certain fact.
The only thing Rehling can muster is a pious wish for 'counter-power' - and he doesn't really manage to make it sound like anything other than throwing cobblestones at EU summits, presented as an expression of 'the real public opinion'. It is too pathetic, and actually embarrassing for Information to serve as a scout magazine for that kind of preaching.
How about proposing a parliamentary reaction? Or perhaps something as unusual as a 'reaction in way of life'? That is, taking up the challenge from what must be experienced as a de-parliamentarization out on the political fringe, as power is gathered with ever greater certainty in consensus parties at the center.
The reaction to that is not a boycott of Californian red wine in protest against war in Iraq, or a mass exodus to straw-bale houses on Møn. The former is just a surrogate for old, ineffective forms of action, and the latter leads to a severe housing shortage in 10 years when the straw houses have all rotted away.
The 'symbolic act' has lost its power if de-parliamentarization really is a keyword. Then some actual change of deeds is needed, and that change of deeds will not get anywhere unless it starts by establishing that everyday life is here to stay, and that we are not all going to live ecologically, so close to our jobs that we can bike there.
A modification of the copyright system that satisfies both the need to keep the public space open and the rights of ownership for intellectual property would be extending fair use to fair recovery. If copyrighted material languishes in the private space of the copyright claimant and a case can be made that the property rights of the material are not being exercised fairly (i.e. at non-ridiculous prices) then the rights to fair use should also include the right to obtain the material in the first place.
Lessig points to a suggestion for a copyright extension tax which adds a small charge to copyright extensions. Failure to pay for the extension immediately makes the copyrighted material available to the public.
This accomplishes approximately the same thing, but not quite. First of all people might object to the tax as 'yet another way to put money in government pockets', and secondly - it does not prevent the censorship by ownership that is becoming commonplace. To me that is the real target for the activity to keep the public domain open.
You might argue that litigating to exercise the right to fair recovery would likely incur a higher cost than the copyright extension tax, but that cuts both ways, and it is entirely likely that it is cheaper for copyright holders to make their material available through some kind of library system, if not to the general public. Many countries have rules like this for archival purposes anyway.
Tim Berners-Lee summarises what the web is all about here. And the message (mine, not necessarily his) to newspapers is simple:
I'll rephrase my position from a previous post.
Hypertext consists of text - with functionality added - in the form of links. It is the whole thing that is the text, not just the words. The links are part of the meaning of the text, not a 'delivery device' or 'functionality' auxiliary to the text but not really text - hence speech - itself. On the other hand, the links are machineable, i.e. easily accessible to software. The machineability of the links is what makes them interesting.
There is an interesting and fundamental fact in the theory of computation: there is no fundamental distinction between program and data. What that means is that if you are interested in producing a particular result by entering data into a computer program, you can rework the program and the data to move basically any of the information involved from what is considered program to what is considered data, and vice versa.
The idea is simple: to add two numbers a and b, you can either use your plus program to compute plus(a,b), or you can use your plus_a program on b: plus_a(b), or your plus_b program on a: plus_b(a).
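In Python this program/data shuffle is just partial application - a minimal sketch, using the function names from the example above:

```python
from functools import partial

def plus(a, b):
    """The general program: both operands arrive as data."""
    return a + b

# Bake the value of 'a' into the program itself: plus_a is a new
# program that carries 3 as code rather than receiving it as data.
plus_a = partial(plus, 3)   # equivalent to lambda b: 3 + b

print(plus(3, 4))   # 7 -- both operands are data
print(plus_a(4))    # 7 -- 'a' has moved from data into the program
```

The same result is computed either way; what changed is only which part of the information counts as "program" and which as "data".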
The example may look silly but of course this underlies all that we do with software, and the browser is a case in point:
Accessing web pages involves numerous formats of data, parsed and interpreted by a stack of processors. At the very least this stack contains IP packets at the bottom, TCP connections on top of that, the HTTP protocol on top of that, and an HTML renderer on top of that. Conversely, each layer's data is packaged inside the layer below: the HTML data is packaged inside an HTTP interaction, which is packed inside a TCP socket connection, which is packaged inside IP packets. The important thing is that each layer provides a computed result as if it were just data to the layer above. So when we say that we retrieve an HTML document from berlingske.dk, we are actually computing an HTML document from a large number of IP packets we have received from the Berlingske server (well, actually I received the packets from my ISP, who in turn received them from ... (insert arbitrary number of links here) ... who retrieved them from a server at berlingske.dk).
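The layering can be sketched as a chain of functions, each treating the computed output of the layer below as plain input data. This is a drastically simplified toy (all the parsing is fake), not a protocol implementation:

```python
# Toy model of the stack: each layer is a function whose *computed
# result* the next layer consumes as if it were just data.

def tcp_reassemble(ip_packets):
    """Pretend IP/TCP layer: stitch packet payloads into one byte stream."""
    return b"".join(ip_packets)

def http_parse(stream):
    """Pretend HTTP layer: split the headers from the body."""
    headers, _, body = stream.partition(b"\r\n\r\n")
    return body

def html_render(body):
    """Pretend HTML layer: extract visible text from a trivial document."""
    return body.decode().replace("<p>", "").replace("</p>", "")

packets = [b"HTTP/1.1 200 OK\r\nContent-Type: text/html",
           b"\r\n\r\n<p>Hello from berlingske.dk</p>"]
text = html_render(http_parse(tcp_reassemble(packets)))
print(text)  # Hello from berlingske.dk
```

The "HTML document" only exists as the end product of the chain; at the bottom there are just packets.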
This is of legal interest since one would generally consider the program 'active', i.e. the executor of the program is legally responsible for its use or misuse, whereas the data is 'passive', i.e. just used. And the fact that you can move any specific bit of information from the active part to the passive part and vice versa is of course essential to the problems with digital technology, intellectual rights, and legal responsibility. It should be clear from the above that my reading of pages at berlingske.dk involves a largish number of actions by many people. The actions span from writing the Windows TCP/IP stack, to typing the words I end up reading, to actually clicking the URL. There are many intermediaries (machines or people) responsible for assembling some of the meaning presented to me as data and/or software at various levels. Exactly which of these many intermediaries should be considered to play a direct part in my ability to access the information is almost impossible to say.
Clearly each of the actors involved has the ability to move responsibility around by repackaging what used to be 'passive' data as 'active' software. Newsbooster's latest idea does exactly that. There are tons of other cases. I use an adblocker plugin for my browser, so when I look at berlingske.dk I don't have to wait for all the silly GIFs and - even more important - I don't have to look at them, which is the real annoyance of ads. Clearly this is automated use of data published on berlingske.dk in a way berlingske.dk did not originally endorse. But it should be equally clear that it is entirely legal. The use of published material is clearly not controllable by the website providers, and it is impossible to establish a boundary between what constitutes "the published work" and what constitutes "illegal derivation from the published work" when the work is made available in a machineable format and can therefore be decomposed through layers of automation/software.
So the 'no to deep links' position makes absolutely no sense, as long as I am able to run software on my own computer. Should the newspapers manage to get an injunction against the new 'active' link provision, the responsibility for finding the links can be moved to other places in the software.
What does make sense, then, and how do people get paid? Clearly that is a problem that needs to be solved, but not in this heavy-handed manner. I think that is a job for another post that may be considered 'in progress'.
A case has been presented to limit the possibility of extending copyrights perpetually - the most balanced account of the case seems to be this one.
At Lawrence Lessig's website an account is made of how copyright holders, by not exercising their exclusive marketing rights, are effectively removing copyrighted works from history - since no one can get at them.
It seems clear - with a few well-run franchises (Elvis and Mickey) as the exception - that this does not really hinder the cultural evolution where the arts are concerned. An evolutionist view of the story would be that since copyright holders are so ineffective at deriving value from their memes, other memes take over and dominate our cultural space, rendering the discussion insignificant. A case in point would be the rise of Manga comics in the West. It satisfies curiosities the old memes fail to adapt to. Further witnesses to the insignificance of the lawsuit would be the short shelf-life of contemporary music, the short shelf-life of most movies, etc.
Of course I am a little too much of an old-world character to actually think like that. We nostalgics do believe in historical value. But still it is a persuasive argument even if you like old culture.
In the case of technology and science time limits are essential to the extent that ownership of old ideas induces ownership of the new, but the pace of invention itself means that there are time limits on the economic viability of an idea, since it will be rendered meaningless by future ideas not yet discovered/created.
I've been looking for a decent XML text which defines XML concisely, without wasting a lot of space on API examples and the usual "Let's say you want to organize your collection of movies - here's how you might write a DTD to do that". The choice is very limited. Which is surprising since XML is so popular, but not so surprising when people just 'use the API' most of the time. Also, the specs for most of XML are readable, if not as readable as possible. There's the relevant Nutshell book, which is quite good, and then there's one from the developmentor people. Needless to say - since developmentor people have played such a role in SOAP - the latter has a chapter on SOAP, which I really think is out of place, since SOAP is a solution to a completely different problem than XML and does not have any of the scope XML has.
There's an XML pocket reference from O'Reilly as well - but what I've seen of it is too talkative for there to be enough space in 100 pages to cover XML nicely.
The perfect text would be super short sections on all the important topics - and it doesn't exist AFAIK. An opportunity perhaps.
Power outages seem to be this winter's theme on classy.dk - this time around my niece is not to blame. My street (or the section of it I live on) lost power stability for a couple of hours this morning, and while that was happening I took the servers down, since they were rebooting once a minute.
It is unlikely anyone noticed.
The plot of The Game - David Fincher's third feature film - turns out to have a real counterpart almost as fantastic as the secret leisure organisation in the film. The players of Cthulhu Lives! stage highly realistic live-action games of tremendous dimensions in time and space. This is not like the Murder Mystery evening or weekend games that one can buy, but lengthy, meticulously enacted excursions into the world of H.P. Lovecraft.
The worst thing imaginable - if you are a Regensen resident, that is - has happened. A fierce fire at Regensen has destroyed the ceiling and roof over the first and second corridors. Naturally this kind of thing has to happen just as the building has been renovated and the old, dangerous electrical installations replaced with more modern ones throughout. Fortunately no one was hurt.
Jon Udell has a nice piece on the power of combining simple things via Services and Links. The absence of SOAP from the discussion is a quality in itself in this discussion of web service infrastructure. Pure publishing and an open link space are the original selling points of the web, and they remain what it does best.
As Udell shows, this does not mean that the web becomes non-machineable, even if the lack of typing of most of URL space may make the services fail a lot. But that's a feature not a bug for some kinds of information.
Working out from that link via this link, there was a sort of webbed thread about the semantic web effort and how it relates to the current vogue of piggybacking infrastructure on top of the HTML web via clever parsing and abuse of semantic elements in HTML. My ability to read the thread appears to be derived directly from the two-way functions in MT that we don't use at classy.dk, since the embarrassing lack of readers would then be readily on display - so that's a nice indication of where URL space could go as a mechanism.
The conflict between heavy, standards-based, strongly typed, hard-to-deploy solutions and easy-to-do makeshift solutions is a classic, and the many weblog formats have clearly been able to build tremendous momentum from good-enough interfaces. The main threat to the semantic web effort remains the need to build infrastructure before attaining value, and the unclear value the semantic web provides to first-mover implementers. And of course any successful application would have to expect the data to be broken a lot and still be valuable.
It's nice to blog - and I have fun doing it. But I really need to make this site a little uglier, so all the typos that are here (since a browser is a terrible interface) will appear more natural and not just leave the impression that my language skills suck (which they don't - honest).
This is good fun. A physicist computes the absolute boundaries on how much computation you could possibly get out of matter. Quantum theory of course limits the information that can be contained in matter, since there are non-zero minimal energies to account for. It turns out that applying Moore's Law on the doubling in capacity of integrated circuits for another 600 years would consume all available resources in the universe as part of the computation. Finiteness is a humbling thing.
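The arithmetic behind that claim is easy to check. Assuming the commonly quoted 18-month doubling period and an order-of-magnitude estimate of ~10^120 elementary operations available in the observable universe (Seth Lloyd's figure; both numbers are my assumptions here, not taken from the article):

```python
import math

DOUBLING_YEARS = 1.5    # assumed Moore's-law doubling period
YEARS = 600
UNIVERSE_OPS = 1e120    # assumed Lloyd-style estimate, order of magnitude only

doublings = YEARS / DOUBLING_YEARS   # 400 doublings in 600 years
growth = 2 ** doublings              # capacity growth factor

print(f"growth factor after {YEARS} years: about 10^{math.log10(growth):.0f}")
# A factor on the order of 10^120 -- the same order as the estimated
# computational capacity of the entire universe.
```

So 600 years is roughly where exponential circuit growth runs out of universe, which is the article's point.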
Reading this book so close to reading The Cluetrain Manifesto was a mistake. There are too many similarities, and Weinberger even reworks one of the examples from Cluetrain into the new book.
Small Pieces is a nice enough book. It is very chatty and the points Weinberger makes about what human interaction on the internet is all about, while well made, aren't really as new or as unique as the reviews of the book had me believe. The biggest problem with the book is that it is not really about the technology that has enabled the new ways of communication, nor is it about the social mechanisms of people that makes this technology interesting. It is exactly an account of some cases where existing technology has afforded some new social situations and an account of some of the aspects of the impact of these particular situations.
That is of course of independent interest, but in not pointing out why we create the technology or why the technology has the power it does, the book does not really help you speculate about the next new mechanism we will see.
In that way the book seems like too little, too late. Many of the observations made are readily available to any actor in the social situations described (like the painfully obvious lengthy discussions of the importance and nature of 'netiquette', i.e. the informal one-on-one communication form of newsgroups).
Furthermore, the focus (Cluetrain all over again) on the one-to-one feel of the individual networked connection in this book also overlooks the actual layout of the internet, where the information hubs are really the bread and butter of information-age life. With the size the internet has now, I have to say that the usefulness of the one-to-one features of the net is somewhat dominated by the brilliant hubs, which are typically automated systems. But then again, I'm a techie, and that of course means that the automated systems serve up a lot of (inhuman) information of value to me.
As mentioned, I have just read Linked, which was very nice 'pop science' - even if the science bit was toned a little too far down for my tastes (the notes document everything properly though).
The exposition of network effects should be graspable by everybody, and the importance of understanding the dynamics and feel of the networks around us is made very clear. Clearly there are some concepts here that deserve a simple mathematical write-up (as simple as possible) and some nice, well-known simulation tools to demonstrate the power and universality of these effects.
As mentioned in the discussion of Cluetrain below, the chapters on link density in scale-invariant networks and the notion of hubs are really quite an interesting eye-opener for us hopeless futurists, since it becomes clear that 'natural growth' exhibits neither the robustness nor the flat, level playing field we would like. The world is just cruel like that. On the other hand, there are some nice discussions of what features of a network prevent monopoly by dominating some of the information economy's monopoly-generating effects (positive marginal returns in a frictionless information economy).
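The growth rule behind those scale-invariant networks fits in a few lines. This is a minimal preferential-attachment sketch of my own (not code from the book), where newcomers link preferentially to already well-linked nodes:

```python
import random

def preferential_attachment(n_nodes, links_per_node=2, seed=1):
    """Grow a network where new nodes prefer already well-linked nodes."""
    random.seed(seed)
    # 'endpoints' lists each node once per link it holds, so a uniform
    # draw from it picks nodes with probability proportional to degree.
    endpoints = [0, 1]
    degree = {0: 1, 1: 1}
    for new in range(2, n_nodes):
        targets = set()
        while len(targets) < links_per_node:
            targets.add(random.choice(endpoints))
        for t in targets:
            endpoints.extend([new, t])
            degree[new] = degree.get(new, 0) + 1
            degree[t] += 1
    return degree

deg = preferential_attachment(2000)
ranked = sorted(deg.values(), reverse=True)
print("top degrees:", ranked[:5], "median:", ranked[len(ranked) // 2])
```

Run it with a few different seeds: a handful of hubs accumulate a degree far above the median node, which keeps roughly the couple of links it arrived with - the 'rich get richer' shape that produces the power law.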
It's always funny to read optimistic statements that have since proven a little too optimistic (I have a brilliant book from the fifties with the title 'Design for a Brain' - complete with electrical circuit diagrams), and The Cluetrain Manifesto - which was of course a must-read 2-3 years ago, not this year - is an interesting case.
It's not that they are really wrong, it's just that the revolutionary tone really doesn't begin to describe how one feels about the present day internet with more and more intrusive spamming and less and less interesting new stuff.
But I shouldn't complain, and won't really. I prefer the 'brand new net' lovefest that Cluetrain tries to rekindle to the e-commerce babble one hears most of.
One thing is apparent from the enthusiastic descriptions of the one-to-one internet: the authors haven't fully taken in the 'power law' nature of the connected society described in Linked. One of the results of the power-law structure of the internet is the notion of hubs - network hotspots that many network nodes find it useful to connect to. If you've ever been fortunate enough to function as a local hotspot of some kind, you'll know that the gratifying experience of being useful is quickly replaced with a paralyzing sensation of not really being able to do much of anything except handle all of the requests streaming toward you.
The dream of the one-to-one network - where you can reach out and touch someone, and that someone can be someone who really matters, all the time - is stifled because the hubs in interpersonal networks are very easy to saturate, i.e. paralyze intellectually, by piling information requests on top of information requests. That's why there is such a thing as proper channels in a well-run organization. It is a device to liberate the connection points that all communication would otherwise run through, so they can actually do something useful instead of just passing messages around.
There's an interesting equilibrium mechanism to this, though, which deserves to be properly mathematically modeled. Say you have a network where the value of a node is the amount of information it is able to produce and disseminate throughout an organization, and introduce prices for information production as well as for transmission.
Let every node try to optimize. What kind of network do those rules breed? Suppose the nodes are allowed to price their information production as well as their information dissemination as part of their own optimisation. What is optimal behaviour, and what kind of network does the optimal linking behaviour afford?
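One way such a model could be set down in code - everything here (the cost constants, the hop-decay payoff, the convention that a link is paid for by the node named first) is my own invented toy setup, purely to make the question concrete:

```python
# Toy payoff model: each node produces one unit of information; a link
# costs LINK_COST to whoever initiates it; information arriving from
# distance d is worth HOP_DECAY ** d (transmission priced as decay).

LINK_COST = 0.4
HOP_DECAY = 0.5

def node_value(node, links, n):
    """Payoff for one node: decayed information gathered minus link costs."""
    # breadth-first search for distances to every reachable node
    dist = {node: 0}
    frontier = [node]
    while frontier:
        nxt = []
        for u in frontier:
            for v in range(n):
                if ((u, v) in links or (v, u) in links) and v not in dist:
                    dist[v] = dist[u] + 1
                    nxt.append(v)
        frontier = nxt
    gathered = sum(HOP_DECAY ** d for m, d in dist.items() if m != node)
    paid = LINK_COST * sum(1 for a, b in links if a == node)
    return gathered - paid

# One best-response step on a 4-node chain: does node 0's link pay off?
n = 4
links = {(0, 1), (1, 2), (2, 3)}
with_link = node_value(0, links, n)
without = node_value(0, links - {(0, 1)}, n)
print("keep the link" if with_link > without else "drop the link")
```

Iterating that best-response step over all nodes, and letting nodes set their own prices, is exactly the open experiment: it is not obvious whether the equilibrium is a hub-dominated network, a chain, or something else.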
The final conclusion is that the manifesto itself is a lot more interesting than the book about it.