I’ve believed for a long time that the Internet is going to transform the structure of society, like the television, telephone, and printing press did before it. Like its predecessors, it offers new ways to acquire and present information and entertainment, new ways of communicating ideas and spreading culture. But the Internet has a unique trait that its predecessors did not: it’s inherently peer-to-peer.
What do I mean by that? Simple: the Internet is a dumb network linking together many smart peer terminals. Compare this with the telephone and television networks. The telephone network has peer terminals, but they’re dumb peer terminals. Each phone can only send and receive voice information. It can’t perform the same kinds of complex calculations your computer can, and all the “smarts” are in the network. The television network’s even worse – it’s not even a network of peer terminals. There are receiver terminals (namely, your TV) and transmitter terminals (the TV stations). No matter how much you need to, your TV can’t serve as a broadcaster.
The Internet’s different. Theoretically, at least, everyone on the network is equal. The only real thing that separates the laptop I’m writing this on (or the computer you’re reading it on) from Booman’s server is what software it’s running and how reliable its connection is. If I wanted to, I could host something like Booman Tribune on my laptop. It wouldn’t be a very smart move, since it’d go offline whenever I closed the lid or walked out of WiFi range, but I could do it.
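Just to show how low the bar is, here’s a minimal sketch using nothing but Python’s standard library (the port and directory are arbitrary choices for illustration, not anything Booman Tribune actually runs):

    # Minimal sketch: any ordinary computer can act as a web server.
    # Serves the files in the current directory over HTTP on port 8000.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    server = HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler)
    print("Serving on http://localhost:8000/ -- Ctrl+C to stop")
    server.serve_forever()

The only things separating that from a “real” host are uptime, bandwidth, and the software stacked on top; the network itself doesn’t distinguish between them.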
Despite attempts by various large corporate entities (notably, telecoms) to curtail it, this mentality’s spread to basically every corner of the Internet. The technologies that succeed are the ones that embrace the “end to end” or “peer to peer” principle. Sooner or later, those that don’t run up against a competitor that does, and they die. The peer-to-peer model’s even affecting software development. Instead of a few large companies writing all of our software, increasingly large amounts of it are being written by communities of developers working together and freely exchanging ideas. And making money off it. (Though it’s interesting to note that the Free Software movement precedes anything resembling the modern Internet by a good two years. Richard Stallman is a true visionary, and possibly the greatest genius of the 20th century.)
More recently, we’ve seen a drastic rise in user-created content sharing tools and social networking. Blogs, YouTube, Facebook, MySpace… All embody the end-to-end principle.
Ranjit Mathoda starts talking about how all this applies to politics, specifically Barack Obama’s campaign. It’s an amazing, inspiring read. From the look of things, Obama’s campaign has embraced the end-to-end principle in a way that even Howard Dean’s 2004 primary campaign didn’t. Why’s his ground game so good? It’s not because he’s got talented organizers working for him. It’s because his web site gives his supporters the tools they need to be self-organizing. It connects his supporters to each other without central organization or supervision from the campaign.
The one thing that stands out as disagreeable is Mathoda’s claim about message dissemination. That’s Oldtype thinking. The Obama campaign doesn’t need to exploit some emotional connection to a social networking site to pass on talking points, because it’s transcended talking points. Just as Kennedy transcended the techniques used to dominate the message in newspapers and on news radio, Obama’s transcended the techniques perfected by Karl Rove for dominating the message on television and talk radio. He is, as Mathoda correctly points out later, a “nonhierarchical collaborative leader inspiring autonomous individuals to cooperate for the sake of common concerns”. Why bother with talking points and centrally-organized rallies when he can seize on a matter of common passion between himself and his supporters – poverty, war, social justice, whatever – and give them the tools to promote it and take action on it themselves?
Reading further, I don’t know why he even bothered to include that bogus statement on talking points, since he correctly identifies the strength of this distributed intelligence a few paragraphs later when he begins talking about Obama’s plans for citizen involvement in the process of government. That section, in particular, is incredibly inspiring, and I strongly encourage everyone to read it as many times as it takes to sink in.
But what really impresses me? The impetus for all this distributed thinking seems to be Obama himself. This wasn’t cooked up by some committee of high-paid consultants in a back room, or concocted based on documented known good practices. This was created because Obama looked at the Internet and saw the same things as Richard Stallman. This is why the claims about Obama not having a plan are bogus. He’s not going into office with a meticulously detailed four-year plan for fixing all of America’s ills. He’s much more audacious and ambitious than that. He’s going into office with a plan to restructure America and make it fix itself, to make government smarter, more adaptable, and more responsive. He’s going in knowing he knows nothing, and prepared to harness the power of an entire country to learn, and evaluate, and decide as best he can.
This is why Obama is the only candidate who can fix the problems America faces today. This is why he’s come from a laughable long-shot to the inarguable victor. And I think those that write him off as business as usual, or condemn him as a “corporate” candidate, are going to be pleasantly surprised. Obama’s not winning because the system supports him. He’s winning because he’s beating the system at its own game.
The Newtypes are taking over. If we don’t win this cycle, we’re going to win the next one, or the one after that. We’re improving faster than our competition. Just look at the quantum leap from Dean to Obama. Things are about to get very interesting…
.
By 2030, providing power for the Internet, switching, and data centers will require as much electricity as the entire world consumes today. Scientists (pdf) recently estimated that web sites and data servers consumed over 7 terawatt-hours (TWh) of electricity in the U.S. alone.
In the U.S., data server energy consumption grew 14% per year between 2000 and 2005. At this rate, the United States alone will need nearly a dozen new power plants by 2011. (Source: EPA)
Rising electricity costs (pdf file)
"But I will not let myself be reduced to silence."
I’d like to point out that that’s completely baseless. Go read the actual paper instead of just the summary: the 8% figure comes from George W. Bush, while the 7 TWh figure is unsourced. So let’s look at an actual quote that’s not coming from a reactionary, authoritarian technophobe:
That’s from the first PDF you link. So, first off, this is an estimate. No-one has solid figures, and the study’s authors haven’t even released their methodology. If they had, it would have been cited. Secondly, this is “Web sites and other servers”, “data centers”, and “data servers”. What exactly does that include? Well, for one thing, it doesn’t include switching, i.e., the bits that make the Internet go.
Now, what are “Web sites and other servers”, “data centers”, and “data servers”, exactly? How does that group actually break down? It’s been a while since I actually looked at the studies in detail, but I’ll tell you right off the bat: if the terms used here are accurate, what you and most other people think of as “the Internet” is a negligible fraction of this total consumption. Most of that consumption comes from the operation of “data centers” and “data servers” used by the very type of large, centralized organizations I condemn. Most of these corporate data centers aren’t part of the “public Internet” at all, but are private systems used by large corporations (particularly multinationals) to conduct business. That’s why “other servers” and “data centers” are mentioned separately from “web sites”.
The exceptions? Large scientific processing clusters, MMORPGs (basically just World of Warcraft at this point) and search engines (at this point, that means Google and Yahoo). Scientific processing clusters and online games are negligible compared to corporate data centers; there simply aren’t that many of them. Google, at least, is sinking large amounts of funding into ensuring that its data centers are powered almost entirely by renewable energy.
Even going beyond that, one needs to compare efficiency, load, and benefit, not just toss out imaginary power consumption numbers and pretend they’re relevant and damning. I’d bet good money that the Internet’s one of the most efficient things on our balance sheet right now, and your time and energy would be much better spent decrying inefficient transportation, interior illumination, climate control, building codes, factory farm agriculture, or defense spending. Not everything that consumes electricity is inherently evil.
Estimates also vary widely. As documented by Saul Griffith, they go as low as 0.5% of total US energy use. In his extremely generous breakdown of his personal materialism, his portion of the Internet’s cost is hardly worth mentioning – less than his combined vegetable and wine consumption, and barely more than his share of the cost of the US military and nuclear arsenal.
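For perspective, here’s a back-of-the-envelope calculation. The total-consumption figure is my own assumption (US electricity consumption has been somewhere in the neighborhood of 4,000 TWh a year), so treat the result as an order-of-magnitude estimate rather than a hard number:

    # Back-of-the-envelope: how big is 7 TWh next to total US electricity use?
    # ASSUMPTION: total US electricity consumption of roughly 4,000 TWh/year.
    servers_twh = 7.0      # "web sites and data servers" figure quoted above
    us_total_twh = 4000.0  # assumed total annual US electricity consumption

    share = servers_twh / us_total_twh
    print(f"Servers' share of US electricity: {share:.2%}")  # roughly 0.2%

Even if the true figure were several times higher, it would still be a sliver next to lighting, climate control, or industrial use.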
So, are you actually going to reply to this and engage in discussion, or are you going to keep rudely tossing out misleading links and summary fragments without actually contributing anything of value?
.
I read a news item in Europe about this topic. Unfortunately there was no English translation. The Fraunhofer Institut has calculated that the IT industry’s power consumption in Germany will reach 25 terawatt-hours (TWh) by 2010. At CeBIT, a lot of coverage was devoted to reducing the power consumption of PCs and of the infrastructure behind the Internet and computer networks. Frequent reference was made to the Climate Savers Computing Initiative.
According to Prof. Gerhard Fettweis of TU Dresden, the IT industry will consume as much electricity in 2030 as the world consumes at this moment.
Most likely the figures are raw projections from today’s technology and growth forecasts, and therefore an exaggeration. They may serve the purpose of getting the consumer and business community focused on an in-house change in attitude towards power management and conservation.
● Estimating total power consumption by servers in the U.S. and the world
● Energy at the Crossroads: Global Perspectives and Uncertainties
"But I will not let myself be reduced to silence."
Right, but this is one area where a market does (mostly) work. Because the big consumers are all (or were all, last I checked) big corporate data centers, or at least, big corporate data centers and co-lo hosting companies, they’ve got the financial clout needed to press for research into more efficient systems. And, in fact, that research is already happening… The products are just getting implemented in laptops first, because they’re high-margin, high-visibility, and extremely power-sensitive. Laptop performance is now nearly on par with desktop performance, so I think we’re going to see a lot of that power-saving technology migrating over to the server side, particularly as electricity prices rise.
As for the projection figures, again, absolute figures aren’t particularly useful. Equally important are how these stack up against other projections and what assumptions they’re made under. This is particularly important when dealing with IT, because a lot of projections are made based on the assumption of unlimited hyperbolic development (i.e., the Technological Singularity), which is not often borne out in practice. Such things more often resemble “S-curves” – logistic curves, if you want the technical term – turning up sharply for a period, then levelling off. The Internet’s been growing rapidly because it’s a new, transformational technology. I bet if you looked at a similar graph of growth for telephone or television networks, you’d see a similar initial spike followed some time later by a levelling or even a decline.
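To see why the growth assumption matters so much, here’s a toy comparison of a naive compound-growth projection against a logistic S-curve; every number is made up purely for illustration:

    # Toy comparison: exponential vs. logistic ("S-curve") growth projections.
    # Every number here is illustrative, not real consumption data.
    import math

    def exponential(t, x0=1.0, rate=0.14):
        """Fixed 14%/year compound growth, forever."""
        return x0 * (1 + rate) ** t

    def logistic(t, ceiling=10.0, x0=1.0, rate=0.14):
        """Roughly the same early growth, but levelling off toward a ceiling."""
        return ceiling / (1 + (ceiling / x0 - 1) * math.exp(-rate * t))

    for year in (0, 5, 10, 20, 30):
        print(f"year {year:2d}: exponential {exponential(year):6.1f}"
              f"   logistic {logistic(year):4.1f}")

For the first decade the two curves are nearly indistinguishable, which is exactly why projections made during a growth spurt tend to overshoot once the technology matures.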
There’s also the matter of relative energy use compared to alternatives. Say Obama does manage to move the US government to a more-or-less completely electronic, publicly accessible model. Is this going to consume more or less energy (not just electricity) than the existing (presumably paper-heavy) model? My money’s on less, but I could be wrong.
Also, I’d be curious to know how much of that cost is the cost of actually running the servers, and how much is the cost of air conditioning and otherwise controlling the environment in the massive server rooms typically used by these data centers. Large clusters of computers have a major disadvantage there compared to smaller, distributed clusters. Because of the density, they’re very sensitive to their environment (particularly disposing of the heat they generate), and so it takes a lot of energy to keep their environment appropriately cool and sterile. If that’s a large percentage, it might be possible to significantly reduce it just by moving away from this obsolete centralized model!
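As a rough illustration of how much that overhead can matter: the data-center industry measures it with a ratio called PUE, total facility power divided by power delivered to the computing gear. The numbers below are assumptions for the sake of the example, not measurements from any real facility:

    # Rough illustration of facility overhead using the PUE ratio
    # (total facility power / IT equipment power). Numbers are assumed examples.
    it_load_kw = 500.0          # assumed power drawn by the servers themselves
    pue_dense_datacenter = 2.0  # assumed: half the facility's draw is overhead
    pue_distributed = 1.3       # assumed: smaller, less dense installations

    for label, pue in (("dense data center", pue_dense_datacenter),
                       ("distributed cluster", pue_distributed)):
        total_kw = it_load_kw * pue
        overhead_kw = total_kw - it_load_kw
        print(f"{label}: {total_kw:.0f} kW total, {overhead_kw:.0f} kW of it overhead")

If real-world overhead is anywhere near the first case, a sizeable chunk of the headline consumption figures is cooling and air handling rather than computation – and that’s precisely the part a less centralized architecture could attack.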