‘Clean Money’ with SOAP?

Fascinating gambit; I usually associate contests like this with more mature markets. Perhaps it’s a sign that SOAP skills will soon be a profitable addition to enterprise developers’ portfolios…
It’s also an interesting judging panel — several of these folks have been around for multiple hype cycles, so it says something that they’re on board with this bet on Web services’ maturity.
Grand Central Communications Unveils New Developer Program

Grand Central has announced “The Golden Spike,” its first annual contest for developers. The contest starts October 18th in conjunction with the Early Access Program and continues through December 10, 2004, with winners to be announced in January 2005.
The Golden Spike contest provides developers with an opportunity to showcase their innovative work around reusable business processes and Web services development. Using industry-leading tools and resources provided by Grand Central, participants can submit one or more entries in the following categories:
* Best Business Process
* Best Use of SOAP APIs
* Best Use of Rich Client
The grand prize winner will receive a dream workstation of his or her own creation worth up to $10,000, and three first-place winners, one per category, will each take home a $1,000 prize. Contest entries will be judged by a panel of Web services and SOA experts including Tim O’Reilly, founder and chief executive officer of O’Reilly Media; Jason Bloomberg and Ron Schmelzer, senior analysts with industry research firm ZapThink; Bill Appleton, founder, president and chief scientist of DreamFactory; Phil Windley, contributing editor for InfoWorld Test Center; Tony Hong, co-founder of XMethods; Phil Wainewright, chief executive officer of Procullux Ventures and publisher of Loosely Coupled; and Halsey Minor, chief executive officer, chairman and founder of Grand Central Communications.
Blogs Are Decentralization Incarnate

From Biz Stone’s excellent article, The Wisdom of Blogs:
* Diversity of opinion – That’s a no-brainer. Bloggers publish hundreds of thousands of posts daily, each one charged with its author’s unique opinion.
* Independence of members – Except for your friends saying “You’ve got to blog about that!” bloggers are not controlled by anyone else.
* Decentralization – There is no central authority in the blogosphere; publish your blog anywhere you want with any tool you want.
* A method for aggregating opinions – Blog feeds make aggregation a snap, and there is no shortage of services that take advantage of that fact.
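That last condition is the easiest to demonstrate. Here is a minimal, Blogdex-style sketch using only the Python standard library: parse a few RSS feeds and tally which links the most blogs point at. The feeds and URLs below are invented for illustration.

```python
from collections import Counter
from xml.etree import ElementTree

def links_in_feed(rss_xml: str) -> list[str]:
    """Extract the <link> of every <item> in a simple RSS 2.0 feed."""
    root = ElementTree.fromstring(rss_xml)
    return [item.findtext("link", "") for item in root.iter("item")]

def aggregate(feeds: list[str]) -> Counter:
    """Tally how many blogs link to each URL -- a crude hive mind."""
    tally = Counter()
    for rss_xml in feeds:
        for link in set(links_in_feed(rss_xml)):  # one vote per blog
            tally[link] += 1
    return tally

# Two hypothetical blogs' feeds:
feed_a = """<rss><channel>
  <item><link>http://example.org/story</link></item>
  <item><link>http://example.org/other</link></item>
</channel></rss>"""
feed_b = """<rss><channel>
  <item><link>http://example.org/story</link></item>
</channel></rss>"""

# The most-linked story bubbles to the top, no central editor required.
print(aggregate([feed_a, feed_b]).most_common(1))
```

Real aggregators of the era did essentially this at scale, polling thousands of feeds on a schedule.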
The article goes on to talk about how MIT Media Lab project Blogdex (one of the longest-operating and most-visited opinion aggregators) is like a hive mind of the blogosphere, collectively creating a modern Oracle with no single opinion about anything. Excellent.
Decentralized Filesharing Is Huge

CacheLogic Research paints an interesting picture of decentralized filesharing.
The most astonishing finding: global Internet traffic analysis in June 2004 revealed that peer-to-peer represents roughly two-thirds of traffic volume in the United States and more than four-fifths in Asia. By comparison, HTTP is less than a tenth of traffic in Asia and less than a sixth in the United States. CacheLogic calls peer-to-peer the killer application for broadband, with a global reach and a global user base.
Perusing the architectures and protocols section of CacheLogic’s site, we find a table comparing the characteristics of web traffic (HTTP) with those of common peer-to-peer protocols. They point out that first-generation p2p systems were centralized, like Napster; second-generation systems were decentralized, like Gnutella; and now
The third generation architecture is a hybrid of the first two, combining the efficiency and resilience of a centralized network with the stealth characteristics of distributed/decentralised network. This hybrid architecture deploys a hierarchical structure by establishing a backbone network of SuperNodes (or UltraPeers) that take on the characteristics of a central index server. When a client logs on to the network, it makes a direct connection to a single SuperNode which gathers and stores information about peer and content available for sharing.
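A toy sketch of that hybrid topology (class and peer names are invented; no real protocol is modeled): the SuperNode holds the index, so a search hits one node instead of flooding every peer the way Gnutella-style second-generation networks did, while the actual file transfer stays peer-to-peer.

```python
class SuperNode:
    """A local index server in a hybrid p2p network."""

    def __init__(self):
        self.index = {}  # filename -> set of peer ids offering it

    def register(self, peer_id, files):
        """A client logs on and reports what it has available for sharing."""
        for name in files:
            self.index.setdefault(name, set()).add(peer_id)

    def search(self, name):
        """Queries go to the SuperNode's index, not to every peer."""
        return sorted(self.index.get(name, set()))

node = SuperNode()
node.register("peer-1", ["song.mp3", "show.avi"])
node.register("peer-2", ["show.avi"])
print(node.search("show.avi"))  # -> ['peer-1', 'peer-2']
```

A real deployment has many SuperNodes forming a backbone, with each client attached to one of them; the sketch above shows only a single index node.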
Recent developments in peer-to-peer include dynamic port selection and bidirectional streaming of download traffic in the most popular peer-to-peer applications of 2004: BitTorrent (more useful thanks to the many available BitTorrent clients and DV Guide) and eDonkey (and eMule). BitTorrent is by traffic the most popular peer-to-peer application:
BitTorrent’s dominance is likely attributable to two factors: the rise in popularity of downloading television programmes, movies and software; and the size of these files – an MP3 may be 3–5 MB, while BitTorrent often sees files in excess of 500 MB being shared across the peer-to-peer network.
The high usage of eDonkey in Europe can be attributed to the fact that the eDonkey interface is available in a number of different languages – French, German, Spanish, etc.
So even though the hype machine has stopped pumping p2p, the quieter revolution of the last few years has shown that peer-to-peer traffic has steadily grown to a majority of the Internet traffic worldwide.
Wikipedia Turns a Million

Joi Ito points out that Wikipedia just passed one million articles: “Wikipedia is in more than 100 languages with 14 currently having over 10,000 articles… At the current rate of growth, Wikipedia will double in size again by next spring.” Wikipedia itself points to the power of a massive, decentralized content authoring effort.
Ross Mayfield adds, “To put this in perspective, if each article took 1 person week to produce, getting the next million would take 40,000 full-time equivalent resources to get it done in the same amount of predicted time. Co-incidentally Wikipedia has about the same amount of registered users, but they have day jobs too.”
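Mayfield’s arithmetic checks out if “in the same amount of predicted time” means roughly half a year of working weeks until next spring (the 25-week figure is my assumption):

```python
# Back-of-the-envelope check of the quote above.
articles = 1_000_000       # the next million articles
weeks_per_article = 1      # "1 person week to produce"
weeks_available = 25       # assumption: ~half a year until "next spring"

fte = articles * weeks_per_article / weeks_available
print(f"{fte:,.0f} full-time equivalents")  # -> 40,000 full-time equivalents
```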
Even more impressive, “Wikipedia is a volunteer effort supported by the non-profit Wikimedia Foundation.” When I clicked on their fundraising effort, I discovered that they’re looking to raise fifty thousand dollars — a tiny amount by corporate standards. It speaks to the fact that a massive, decentralized effort need not cost a tremendous amount to have a huge impact.
Electronic Medical Records Are Taking Root Locally

Laura Landro’s 9/22/2004 Wall Street Journal article, “Electronic Medical Records Are Taking Root Locally” (available to WSJ subscribers) talks about how
More than 100 state and local groups are moving quickly to establish their own networks in which various health-care providers can securely share patient information, aiming to cut down on medical errors and duplicated efforts…
The regional networks aim to get local providers to convert patients’ paper medical files to electronic records, and persuade doctors to exchange pertinent information with a patient’s other health-care providers. By using a single network, regional health groups say they can reduce medical mistakes, better track patients with chronic diseases such as diabetes, zip prescriptions electronically to pharmacies, and cut costs by eliminating duplicated lab tests and X-rays…
With no money or federal authority to mandate a national health-care network, regional networks are also emerging as the only solution to wiring up the country’s medical system. Creating a nationwide system for sharing medical records would cost billions of dollars, scaring off many legislators… because the U.S. has a highly fragmented private health-care system, ‘starting from the bottom and working up is the only viable approach,’ says Lewis Redd, who runs the health-care consulting practice for Capgemini.
The federal government’s role, he says, is to push for widespread adoption of a single technical standard that will let all the different medical records in the country eventually talk to each other and share data, all the while allowing access only to authorized users, to ensure privacy. Such technical standards already exist, and David Brailer, the U.S.’s health-information-technology czar, is in the process of deciding how best to endorse them and provide guidelines for their use.
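The core rule those networks have to enforce (share a patient’s record across providers, but only with authorized users) can be sketched in a few lines. Everything below is invented for illustration; it models no real standard such as HL7, just the authorization idea.

```python
class RegionalNetwork:
    """Toy record-sharing network with per-patient authorization."""

    def __init__(self):
        self.records = {}      # patient id -> record text
        self.authorized = {}   # patient id -> set of provider ids

    def add_record(self, patient, provider, text):
        """A provider files a record and becomes authorized for it."""
        self.records[patient] = text
        self.authorized.setdefault(patient, set()).add(provider)

    def grant(self, patient, provider):
        """Extend access to another of the patient's providers."""
        self.authorized.setdefault(patient, set()).add(provider)

    def read(self, patient, provider):
        """Return the record, but only to an authorized provider."""
        if provider not in self.authorized.get(patient, set()):
            raise PermissionError("provider not authorized for this patient")
        return self.records[patient]

net = RegionalNetwork()
net.add_record("p42", "clinic-a", "allergy: penicillin")
net.grant("p42", "pharmacy-b")
print(net.read("p42", "pharmacy-b"))  # the pharmacy sees the shared record
```

The hard part in practice is not this logic but agreeing on the record format, which is exactly the standards question the article raises.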
As evidence mounts that easily-transferable electronic medical records reduce costs and errors, these grassroots regional efforts will build momentum.