Portugal’s 500K Classmate PC order a nail in OLPC coffin

The government of Portugal has announced plans to launch a new education technology program called the Magellan Initiative, which aims to bring low-cost mobile computers to half a million young students. The laptops, which are being developed in collaboration with Intel, will be based on the company's Classmate PC reference design.

"This new collaboration with Intel underscores Portugal's commitment to advance quickly toward a knowledge-based economy," said Portugal's Prime Minister, Jose Socrates. "By equipping our schools with state-of-the-art computing technology and Internet connectivity, we hope to hasten the transition to economic models that benefit our citizens."

The deal is a major victory for Intel's Classmate PC concept, which has been battling for mindshare against the One Laptop Per Child (OLPC) group, a nonprofit organization that emerged from the MIT Media Lab. OLPC has been afflicted by numerous setbacks and is presently mired in internal disputes that raise serious questions about the project's long-term viability. The Classmate PC, which doesn't suffer from OLPC's lack of proper distribution infrastructure or extreme dependence on scale, is beginning to look like the more viable long-term solution. OLPC has secured only 600,000 total orders worldwide.

Although the Classmate PC uses much less innovative hardware than OLPC's unique XO, the Classmate's more conventional approach and strong support for Microsoft's Windows operating system have given it a bit of an edge. Governments have been reluctant to commit to bulk orders of OLPC's largely unproven product. OLPC has attempted to combat this by offering Windows XP, a move that has alienated its open source supporters.

Intel and OLPC have had a strange relationship, alternating between mutual condemnation and vows of partnership. OLPC insisted that low-cost education laptop efforts should be consolidated and centralized so that economies of scale could be optimally leveraged. To this end, OLPC insisted that Intel should drop its Classmate PC project. Intel argued that having multiple options, suppliers, and distribution channels was a better and more sustainable approach. When a compromise couldn't be reached, the tenuous relationship was abandoned.

Various systems based on the Classmate reference design are already being sold in several countries. The existing models range from about $250 to $500 in price and include Intel Atom processors and 1GB of RAM. The precise configuration and cost-per-unit for the Portuguese model have not yet been disclosed. Intel is said to be working on an update to the hardware design, the details of which are expected to emerge next month.

Intel is also helping Portugal launch a new web-based interactive learning system called Intel skoool that is designed to provide resources for teaching math and science (but obviously not spelling). The web site was launched last November and continues to grow.

This looks like another nail in the coffin for the OLPC project. Governments around the world will likely be watching the progress of Portugal's Classmate rollout and it could answer a lot of questions about the efficacy of Intel's approach to low-cost education computers.

Being a better gamer: a guide to changing the world

Truth, Lies, and Video Games

Apathy is for losers. Fine, you don't have time to start a charity or fight Jack [Thompson], then let your work do the talking. Your collective creative output is the real ambassador that touches millions on a global basis. Games have the ability to transform the world. Don't lose sight of that. You create culture. We ARE culture.

Like it or not, you are all already ambassadors for games. So, better make the most of that responsibility! Award or not, I can’t ever do that for you.

These words were spoken by Jason Della Rocca as he accepted the Ambassador Award at the Game Developers Conference this year. Della Rocca has done wonderful work for the International Game Developers Association, and his speech was inspiring: it's everyone's job to put a better face on gaming, because whether you're a hardcore shooter fan or you simply enjoy a few rounds of Peggle, you're taking part in an industry that many don't understand, and some would like to control. We've recently had the pleasure of speaking to some of the best minds currently working to help the gaming industry grow up, and to some very giving people who seek to put a human face on the suffering of many and alleviate some of that pain. Their instruments? Video games.

The video game industry is growing up. Luckily for us, many gamers are growing up with it, and we have many capable hands guiding the art form and protecting it from misinformation about what gaming is and the type of people who enjoy it.

The battle won't be won in the courts

If the mainstream media is to be believed, video games are about as healthy as cigarettes, except they cause psychotic breaks instead of cancer. Politicians use this artificially created fear of games as an excuse to pass legislation to criminalize the sale of certain games, although so far, each of these pieces of legislation has been successfully challenged and overturned in the courts. While more people than ever are playing games, and the audience for those games is expanding to include people of both sexes and all ages, the media's reporting on the industry and its effects on children remains reactionary and woefully misleading.

Being a good person and a gamer is easier than you think.

One of the biggest misconceptions about games is that they're the domain of adolescent boys. According to the Entertainment Software Association, the average age of gamers is now 35. Approximately 60 percent of gamers play video games with their friends, 40 percent of gamers in the US are women, and almost a quarter of all video game players are over the age of fifty. If video games are so popular and widely enjoyed, why are they still vilified in the media?

Who actually plays video games? The ESA's findings might surprise you.

The full answer to that question is complicated, but the short-and-simple version is that, all survey data aside, video games are still a new form of media that older generations, by and large, rarely see or experience first-hand. Human nature is simple: we fear what we don't understand.

"[The demographics for the ESA survey] would mean that for every 16-year-old kid there's a 50-year-old guy to average out to 33, and that’s just not what I see when I go to GameStop or when I read comments online," said Dennis McCauley of GamePolitics when he talked with Ars. "I think gaming is still largely a youth culture thing, at least in the sense of the most involved gamers." McCauley's logic makes sense: enjoying games like TextTwist, MineSweeper, and Solitaire might make older players more amenable towards video games in general, but that doesn't mean they fully understand the nature of the industry. Will Wright gave a speech where he pointed out that games aren't the first new trend to be distrusted. He told us about how he came across an article where the writer described a young man engrossed in some unfamiliar handheld object, apparently to the point where he was out of touch with his surroundings and didn't want to interact with others because it distracted him from his new obsession.

It turns out this account came from a medieval monastery; the object in question was a book in the hands of a young monk. Everything, when it's new to society, has to prove its worth.

Games are still perceived by many as toys, and toys are meant to be played with by children. But the idea of little tykes playing with mature content is more than enough to get concerned parents up in arms. "I think [this attitude is] due in part to the misperception that video games are primarily intended for kids," said Patricia Vance of the ESRB. "When you juxtapose that misperception with the presence of mature content in a video game, it causes concern. The more games are thought of in the same way as movies and TV shows, the more acceptance they'll gain, and that's been happening more and more in recent years."

Some of the biggest opponents of games in recent political history (clockwise from top left): Hillary Clinton, Eliot Spitzer, Andrew J. Lanza, and Leland Yee.

These two ideas about games are at odds: one group sees them as toys with inappropriate content, and the other sees an emerging art form that is giving movies and music a run for their money, quite literally. Now that games have managed to infiltrate the popular culture at large, the politicians are becoming even more alarmed, and the expensive campaign to pass laws controlling the content of games and their sales may find new allies.

Still, some in the industry are hopeful, and they think these attempts will trend downward. "[The political climate]'s still dicey, but a little better than it has been. The good news is that the number of serious legislative attempts to regulate games at the state level has dropped off over the last two years," said McCauley. "Right now, only Massachusetts is pushing legislation, and that bill seems stalled. The California law is still under appeal by Gov. Schwarzenegger, and we're waiting to see if Minnesota decides to take its bill to the US Supreme Court. They have exhausted their other avenues of appeal. New York had a bill that looked to be on the fast track to passage until Gov. Spitzer got caught going the GTA hooker route." Since McCauley talked to Ars, the New York bill has been signed into law by Gov. David Paterson (Spitzer's replacement), and the industry is potentially going to challenge it in court.

Politicians have made a point of trying to limit the sales and content of games they find morally objectionable, but the industry has managed to legally overturn every piece of legislation with an undefeated record of 9-0. This intimidating track record might not stall every attempt, but it sends a powerful message. The fact that gamers are making up an increasing percentage of voters might have a little something to do with it, too. At least part of this recognition is due to groups like the Entertainment Consumers Association, the Entertainment Software Association, and the ESRB making an effort to educate both the general populace and elected officials about the reality of video games. The gamers themselves have stepped up, and have become their own PR machine, showing the world that our favorite hobby is enjoyed by good, upstanding people.

First responders to FCC: give up national D Block pipe dream

They packed Brooklyn, New York's elegant Borough Hall yesterday—police and fire department officials, telco lawyers, and a former state Attorney General—lining up to tell all five members of the Federal Communications Commission how it should set up a national broadband public safety communications system. But before they spoke, FCC Commissioner Michael Copps laid out the truth: years into the process, the agency still isn't even sure what to do.

"What purpose should this network serve?" Copps asked in his opening statement at the en banc hearing. "Video? Data? A backbone that connects existing voice networks? Or a whole new, built-from-scratch network that can do it all?"

How much will it cost? Copps wondered. And most perplexing: "Should licenses be regional or national?"

Actually, many panelists at the hearing had an answer to that last question. Drop the vain attempt to auction off the so-called 700MHz "D Block" to one national licensee, they told the FCC. Instead, allow public safety agencies to access the spectrum and build their systems on a regional basis.

"There is nothing wrong, per se, with the goal of nationwide interoperability," argued James Farmer, former counsel to the 9/11 Commission as well as former New Jersey Attorney General. "The way that emergencies actually happen, however, suggests that the best way to achieve interoperability nationwide is by first building it locally and regionally, and then interconnecting the regional interoperable networks."

The head of New York City's Communications Division put it more plainly. "The Commission should not re-auction the 'D' Block spectrum," Charles Dowd said, "and should appeal to Congress to permit the allocation of this spectrum immediately to public safety."

The position that the government should authorize a broadband-based public safety communications network built "from the bottom up," as Farmer put it, comes in the wake of the single failed portion of the FCC's recent 700MHz auction, which concluded in mid-March. That massive sale of soon-to-be-vacated analog TV channels raked in over $19 billion, but the spectrum reserved for public safety went unsold. A successful bidder on the D Block's two hunks of bandwidth would have leased them for commercial purposes from a public safety administration agency. But nobody met the FCC's minimum $1.3 billion asking price for the block.

A subsequent FCC audit found that potential buyers were scared to fork over the money, fearful that the public safety burdens of the public/private partnership would leave them with many obligations and too little revenue.

Chop it up…

So in May the FCC launched a new proceeding on what to do about the D Block. Since then, it has gotten an earful from cities across the country pleading that they can't wait for the Commission to figure this one out. Chop that block into regions, they say, and let us build our own networks in partnership with commercial vendors. American cities have demonstrated that they can do the job "without the need for a public-private nationwide network," Paul Cosgrave, New York's Director of Information Technology, told the hearing.

Not surprisingly, the potential commercial partners for these proposed regional networks appeared at the FCC hearing, showing great enthusiasm for the idea. AT&T's marketing development director Stacey Black recommended that, instead of auctioning off the D Block on a national basis, the FCC give it over to the designated public safety administrator and let it develop national interoperability guidelines. Cities and regional safety planners would then submit RFPs (Requests for Proposals) that conform to the specs and build their own systems.

…or keep it whole?

But there are more than a couple of problems with this line of thought, defenders of the national D Block plan noted. Robert Gurss of the Association of Public-Safety Communications Officials-International countered that, unlike New York City, many suburban and rural parts of the United States don't have the resources to build their own regional public safety communications infrastructures.

And some commenters worried about whether a regionally-based plan could be unified without strong national oversight. "From my perspective, if you don't have some kind of leadership, some kind of national governance" to make even a cross-regional, inter-related system happen, "it will never happen," warned Harlin McEwen, Chair of the Public Safety Spectrum Trust Corporation, which will administer the public safety aspect of the D Block. "History has told us that we have lots of incompatible local systems that get built out with local requirements without any national consideration."

Queries about funding a regional scheme repeatedly popped up during the hearing, with a sensible observation coming from William J. Andrle, Jr. of Northrop Grumman Information Technology. "It does bear mentioning again that this exercise might be unnecessary if there was at least some direct funding from the federal government," Andrle stated during his testimony. "We have federal matching funds for highways, the environment, and other needs. Why nothing—zero—to facilitate the establishment of interoperable public safety broadband services that will enable first responders to better protect the public?"

Can the FCC change course?

Then there's the question of whether the FCC even has the legal right to drop its D Block plan and adopt a regional approach. A skeptical Commissioner Robert M. McDowell asked regional plan supporters AT&T and Verizon whether they thought the agency could legally make the switch.

"Did I hear AT&T and Verizon correctly?" McDowell said. "You don't know if we've got the statutory authority."

"I do not know, Commissioner," replied Verizon Vice President Don Brittingham.

"Yet you're making this proposal, without knowing… " an obviously bemused McDowell continued.

"Let me be more clear," Brittingham explained. "As a lawyer, I don't know technically whether you have authority or not. But even if you don't currently, what Verizon is saying is that, in order to solve this problem in the best interest of public safety, Congress should step in and do that. There should be legislation to give you the authority to reallocate the spectrum."

Will Congress step in soon? McDowell wondered. AT&T's Black offered a few hopeful words. Larry Krevor, Vice President of Sprint-Nextel, was a little more candid with the Commission. "I think at this point you have to auction the spectrum," he advised. The FCC would have to go back to Congress to set up a different plan. "And certainly with the US about to have an election and a new Congress and new Committee assignments, etcetera, the Hill will probably not move as quickly as we might like."

Towards the end of the hearing, McDowell promised that the Commission will make a decision on the D Block, but also confided his frustration. "The bottom line is that there is no bottom line right now," he said. "There is no critical mass behind a particular set of concepts other than something needs to be done. It's hard to find agreement. So again, I will recommend a strong prayer for the FCC."

Further reading:

Testimony of the panelists at the FCC's hearing on the D Block

IE8 Beta 2 getting heavy performance, crash-recovery tweaks

More details about Microsoft's next version of its ailing browser have been released in the build-up to the second beta release due next month. The first beta, released in March, was aimed at web developers. It brought much-needed improvements to standards compliance, but suffered from shaky reliability and inconsistent performance.

Beta 2 is aimed at a general audience: not just web developers who need early access to IE8 to find out what breaks (and what finally works), but also IT staff evaluating the next browser version to decide whether to deploy it, as well as end-users who just have to run the latest version of everything even if it isn't quite finished yet. As well as the all-important standards compliance, IE8 brings a raft of new security, reliability, and management features. The official IE blog has described some of these already, and on Monday gave more details about what to expect in Beta 2.

With IE8, Microsoft is attempting to solve one of the most annoying problems with today's multi-window, multi-tab browsers; namely, the disastrous effect that a browser crash has. It is an unfortunate feature of most browsers that a crash in one tab takes down the whole browser instance. Whether the cause is a bug in the browser itself, a malicious script, or a badly-written plug-in, the effect is the same; not only does the tab that caused the problem disappear, so does the tab with your half-composed forum post, the train timetable you need to get home, and the audio stream you're listening to.

IE8 tackles this by separating each tab into its own process, a feature it calls "Loosely Coupled IE" (LCIE). Starting IE8 actually creates two processes: one for the window frame, address bar, toolbar, and tab bar, and a second for the tab itself. Subsequent tabs may also open in new processes. Running a tab in its own process allows that tab to crash (for any reason) without disrupting any other tab. This feature was present in Beta 1; in Beta 2, Microsoft has worked to reduce the overhead it causes and improve its performance. For example, the frame processes are now merged, so starting IE several times will only create new tabs in the existing frame.

The ratio between tabs and processes is not always 1:1; although one process per tab provides the most isolation, the actual ratio will depend on the machine's capabilities. This process separation also resolves a major annoyance with IE on Windows Vista. In Vista, sites in different security zones cannot be opened in the same IE window: a file opened from the hard disk cannot coexist with a file opened from the Internet; instead, two different IE processes are required, one for each security zone. Because IE8 uses different processes for each tab, this restriction is lifted; different security zones will still use different processes behind the scenes, but they will be able to share the same window.

The final piece of the puzzle is Automatic Crash Recovery (ACR). As with LCIE, this was present in Beta 1, but has been improved for Beta 2. ACR is designed to improve the experience when the inevitable occurs and a tab crashes. Instead of losing everything you were doing in the tab, ACR restarts the process and restores the tab's context—in Beta 1, this meant it opened the same URL and kept the back/forward browser history.

ACR had promise in Beta 1; however, it neglected to recover the most important things—text entered into forms, and session cookies. Without these, the experience is a little frustrating: the browser reopens the right page, but you find yourself logged out and with your half-written e-mail gone. Beta 2 fixes this by recovering both form data and session cookies. This means that Beta 2 will be able to put you right back where you were before the tab crashed, with virtually no interruption.
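The difference between Beta 1 and Beta 2 recovery amounts to how much per-tab state gets snapshotted. A minimal sketch of such a snapshot, with illustrative field names that are not IE8's actual internals:

```python
from dataclasses import dataclass, field

@dataclass
class TabSnapshot:
    """State a crash-recovery mechanism needs to restore a tab."""
    url: str
    history: list = field(default_factory=list)            # back/forward stack (recovered in Beta 1)
    form_data: dict = field(default_factory=dict)           # half-typed input (added in Beta 2)
    session_cookies: dict = field(default_factory=dict)     # keeps you logged in (added in Beta 2)

def recover(snapshot: TabSnapshot) -> TabSnapshot:
    """Restart the tab and replay everything from the snapshot."""
    return TabSnapshot(snapshot.url, list(snapshot.history),
                       dict(snapshot.form_data), dict(snapshot.session_cookies))

snap = TabSnapshot("https://example.com/compose",
                   history=["https://example.com/inbox"],
                   form_data={"body": "half-written e-mail"},
                   session_cookies={"sid": "abc123"})
restored = recover(snap)
print(restored.form_data["body"])  # → half-written e-mail
```

A Beta 1-style recovery would replay only `url` and `history`; the two extra fields are what make the Beta 2 experience nearly seamless.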

As well as being incomplete, ACR in Beta 1 was not itself particularly reliable; it was easy to make the browser get into a never-ending cycle of crashing, restarting, recovering, and then immediately crashing (because the URL being recovered caused the crash in the first place). Microsoft has not said anything about whether this will continue to be a problem.
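Microsoft hasn't said how it would break that cycle, but a common defense is to cap the number of consecutive recovery attempts per URL before giving up and showing an error page. A sketch of that guard (the threshold and function are hypothetical, not an IE8 API):

```python
MAX_ATTEMPTS = 2  # hypothetical cutoff before recovery gives up

def should_recover(url: str, attempts: dict) -> bool:
    """Return False once a URL has crashed too many times in a row,
    breaking the crash -> restore -> crash loop."""
    count = attempts.get(url, 0)
    if count >= MAX_ATTEMPTS:
        return False  # stop reloading; show an error page instead
    attempts[url] = count + 1
    return True

attempts = {}
print(should_recover("https://crashy.example", attempts))  # → True
print(should_recover("https://crashy.example", attempts))  # → True
print(should_recover("https://crashy.example", attempts))  # → False
```

Resetting the counter after a tab survives for some interval would let genuinely transient crashes still be recovered indefinitely.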

Of course, while better handling of crashes is no bad thing, it would be even better for the browser not to crash in the first place. Microsoft has long had an (opt-in) system for reporting crashes and hangs back to the company—Windows Error Reporting (aka Watson). This data allows the company to locate bugs and determine which are in need of the most attention. On the blog, the IE team stated that they have committed to fixing the top 50 percent of all the Watson errors they have; this should provide a significant boost to reliability.

When IE8 is released later this year, it will undoubtedly be the best version of Internet Explorer ever. But IE's competition is improving all the time and gaining in popularity, and—at least when it comes to standards compliance—is already superior today to what IE8 will deliver later in the year. Microsoft's uphill battle to stop the rot and turn IE around is far from over.

Short wavelength ultraviolet semiconductor laser diodes

Modern electronics have been reaping the benefits of cheap, compact semiconductor lasers for years. From the revolutions in CD and DVD optical data storage, through seemingly simple devices like barcode scanners, to high-powered industrial lasers for material fabrication, the semiconductor laser has been indispensable. Researchers have now successfully produced semiconductor laser diodes that emit in the ultraviolet range, where short wavelengths promise greater precision for scientific and industrial applications, along with an increase in optical data storage densities.

Several different methods exist for producing the coherent light that defines a laser, ranging from gas excitation and chemical reactions to exotic free-electron lasers. Semiconductor lasers differ in that they are a single component with no moving parts, making them very compact. The added bonus is that they can be produced cheaply and efficiently using known semiconductor production techniques widely adopted by industry. Different wavelengths of semiconductor lasers exist, from infrared to deep blue and near ultraviolet, each using a unique semiconductor material that generates photons when its electrons relax from their excited state.

Ultraviolet semiconductor lasers have not suffered from a lack of candidate materials, but rather from a lack of precision in their fabrication. A close relative of the semiconductor laser, the LED, has been able to generate deep ultraviolet light using various combinations of gallium, indium, and aluminum, along with some added nitrogen. Fabricating diodes with these materials, however, usually results in excessive stress and cracking—not a problem for LEDs, but a show-stopper for laser diodes. The use of indium allows for successful fabrication of a laser diode, but it shifts the output light to a longer, less desirable, wavelength.

Researchers set out to make the first indium-free aluminum-gallium-nitride laser diode with a defect density low enough to produce laser emission. The plan of attack included a precision deposition technique called (get ready for it) hetero facet-controlled epitaxial lateral overgrowth, or hetero-FACELO. Various layers of AlGaN with differing atomic ratios were deposited on a sapphire substrate to form waveguides, electrical contacts, claddings, and the all-important multiple quantum well layer, which acts as an electron trap and amplifier of sorts. The resulting laser diode produces a tight output peak at 342.3nm, with a broadening and shift of the peak when lower currents are used to operate the diode.
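Why the push toward shorter wavelengths? The diffraction-limited spot a laser can be focused to scales with its wavelength, so areal storage density scales roughly as one over the wavelength squared. A back-of-envelope comparison against a DVD-class red laser (ignoring numerical aperture, encoding, and media differences, so treat the number as an order-of-magnitude estimate only):

```python
# Rough scaling: areal density ~ 1 / wavelength^2
DVD_RED_NM = 650.0   # typical DVD-drive red laser wavelength
UV_DIODE_NM = 342.3  # output peak of the AlGaN diode described above

density_gain = (DVD_RED_NM / UV_DIODE_NM) ** 2
print(f"~{density_gain:.1f}x areal density versus a red laser")  # → ~3.6x
```

That factor is in the same ballpark as the jump Blu-ray's 405nm diodes made over DVD, which is why each step down in wavelength has historically enabled a new optical storage generation.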

Although your supermarket's barcode scanner may not be in dire need of an ultraviolet laser, the high-tech sector is always in need of the more precise fabrication and detection equipment that an ultraviolet laser can offer. With ultraviolet laser emission now possible in the form of a semiconductor diode, the door is open for the next generation of laser-based devices.

Nature Photonics, 2008. DOI: 10.1038/nphoton.2008.135