Programmers are trying to imbue Web pages with intelligence, and work is under way to re-engineer the network to reduce spam and security troubles.
Stephen Crocker and Vinton Cerf were among the graduate students who joined UCLA professor Len Kleinrock in an engineering lab on Sept. 2, 1969, as bits of meaningless test data flowed silently between two computers, the first exchange on what would become the Arpanet, the Internet's predecessor. By January, three other “nodes” had joined the fledgling network.
Then came e-mail a few years later, a core communications protocol called TCP/IP in the late 1970s, the domain name system in the 1980s and the World Wide Web — now the second most popular application behind e-mail — in 1990.
Today, Crocker continues work on the Internet, designing better tools for collaboration. And as security chairman for the Internet’s key oversight body, he is trying to defend the core addressing system from outside threats, including an attempt last year by a private search engine to grab Web surfers who mistype addresses.
Network providers now make only “best efforts” at delivering data packets, and Crocker said better guarantees are needed to prevent the skips and stutters now common with video.
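The “best efforts” model Crocker describes means the network promises neither delivery nor timing: any packet may be dropped or delayed, which is exactly what produces skips and stutters in video. A minimal sketch (my illustration, not from the article; the loss and jitter figures are arbitrary) of best-effort delivery:

```python
import random

random.seed(1)  # fixed seed so the simulation is repeatable

def best_effort_deliver(packets, loss_rate=0.02, max_jitter_ms=80):
    """Simulate best-effort delivery: each packet may be silently dropped
    or arrive with a random delay, with no guarantees from the network."""
    delivered = []
    for seq in packets:
        if random.random() < loss_rate:
            continue  # dropped; best-effort networks do not retransmit
        delay_ms = random.uniform(0, max_jitter_ms)  # variable delay (jitter)
        delivered.append((seq, delay_ms))
    return delivered

packets = list(range(20))
out = best_effort_deliver(packets)
lost = sorted(set(packets) - {seq for seq, _ in out})
print(f"delivered {len(out)} of {len(packets)} packets; lost: {lost}")
```

A video player must smooth over the variable delays with a buffer and either conceal or re-request lost packets; the stronger guarantees Crocker mentions would move that burden into the network itself.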
Working with NASA, Cerf is also trying to extend the network into outer space to better communicate with spacecraft. But many features being developed today wouldn’t have been possible in the network’s early days, given the slower computing speeds and narrower Internet pipes, or bandwidth, of the era, Cerf said.
While engineers tinker with the Internet’s core framework, some university researchers looking for more speed are developing separate systems that parallel the Internet. Think information highway with an express lane.
The Semantic Web is a next-generation Web designed to make more kinds of data easier for computers to locate and process.
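The article mentions the Semantic Web only in passing. As an illustration (mine, not from the article), its core idea is to publish facts as machine-readable subject-predicate-object “triples” that programs can query directly, rather than as prose a human must read; the sample facts below are drawn from this story:

```python
# Facts expressed as (subject, predicate, object) triples,
# the basic data model behind the Semantic Web (RDF).
triples = [
    ("Arpanet", "firstNodeAt", "UCLA"),
    ("Arpanet", "firstTestDate", "1969-09-02"),
    ("WorldWideWeb", "introducedIn", "1990"),
]

def query(triples, subject=None, predicate=None, obj=None):
    """Return all triples matching the fields that are pinned down;
    a field left as None matches anything (a simple pattern query)."""
    return [
        (s, p, o)
        for (s, p, o) in triples
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# Ask: everything we know about the Arpanet.
print(query(triples, subject="Arpanet"))
```

Real Semantic Web data uses standardized vocabularies and query languages (RDF and SPARQL) rather than ad-hoc tuples, but the pattern-matching idea is the same.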
http://networks.org/?src=ap:internets-birthday