Newly-found hybrid attack embeds Java applet in GIF file

Researchers at NGSSoftware have developed a hybrid attack capable of hiding itself within an image and intend to present details on the exploit at the Black Hat security conference next week. New and esoteric attacks are part and parcel of what Black Hat is about, but this particular vector could target web sites with a particularly vulnerable population: MySpace and Facebook. Social networking web sites tend to attract younger users, and while this particular attack can be used in a variety of ways, embedding the hook in profile photos that are then seeded and targeted at the teen crowd could be a very effective tactic.

The full details of the attack won't be available until next week, but Network World has managed to glean some key facts on its operation. The NGSSoftware team has found a way to embed a Java applet within a GIF; the hybridized file is referred to as a GIFAR. Just to make it clear, this is a file extension of convenience and not the literal name of any particular file type. The GIFAR exploit works because two different programs see the same file differently. The web server that actually holds the file sees it as a GIF file, and serves it accordingly, but when the "image" actually reaches the client, it's opened as a Java applet and run.
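
The full mechanics won't be public until the talk, but the usual trick with such hybrids is simple byte-level concatenation: GIF parsers validate the header at the start of a file, while JAR (ZIP) readers locate their index from the end, so each tool sees a valid file of its own type. Here is a minimal sketch built on that assumption; the file names are purely illustrative and this is not NGSSoftware's published method.

```python
# Hypothetical illustration of a GIF/JAR polyglot built by concatenation.
# Image parsers check the GIF header at the start of the file; ZIP/JAR readers
# scan backwards from the end for the central directory, so the same bytes can
# pass as a GIF on the server and as a JAR for the Java plug-in.
def make_gifar(gif_path: str, jar_path: str, out_path: str) -> None:
    with open(gif_path, "rb") as gif, open(jar_path, "rb") as jar:
        combined = gif.read() + jar.read()
    with open(out_path, "wb") as out:
        out.write(combined)

# Example usage (illustrative file names):
make_gifar("avatar.gif", "applet.jar", "avatar_gifar.gif")
```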

Simply viewing a GIFAR won't infect a system; the attack method requires that the user be linked to the hybridized infection from an appropriately malicious web site. Despite its name, this attack method is not limited to GIFs; ZDNet's Zero Day blog has additional information on the exploit, and states that a number of files could be combined with .JAR, including both JPEGs and DOCs. This seems to indicate that one could actually hide a Java applet inside another Java applet, and then tie both of them together with a BINK file, but the resulting mess would probably fail, even as comedic relief.

The root of the problem isn't within Java itself, but results from weak web application security. ZDNet's blog entry implies that the attack vector might be significantly reduced if web applications would actually parse a file's contents, rather than simply checking the extension. The research team will leave some details of the attack out in their presentation, to prevent immediate exploitation, and Sun intends to issue a patch that will serve as a short-term correction to the problem.
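
As an illustration of what content-based checking might look like (my own sketch, not anything NGSSoftware or Sun has published), a server could decode the upload as an image and reject files that also carry a ZIP/JAR payload. The helper below uses the Pillow library; the policy it enforces is an example, not a complete defense.

```python
# Illustrative server-side check: inspect the bytes rather than the extension.
# Requires Pillow (pip install Pillow).
import io
from PIL import Image

def looks_like_plain_image(data: bytes) -> bool:
    # A ZIP/JAR archive ends with an "end of central directory" record whose
    # signature is PK\x05\x06; a GIF with a JAR glued on will contain it, even
    # though the image portion still decodes cleanly.
    if b"PK\x05\x06" in data:
        return False
    try:
        img = Image.open(io.BytesIO(data))
        img.verify()                      # raises if the image data is malformed
        return img.format in {"GIF", "JPEG", "PNG"}
    except Exception:
        return False
```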

Mapping the peculiar velocities of stars

All things dark are all the rage in cosmology at the moment. There is dark matter—a type of matter that only weakly interacts with light. And dark energy—the label used to denote the observed increase in the rate of expansion of the universe. Our knowledge of what dark matter is and what dark energy denotes is woefully inadequate, opening up a theoretician's paradise. There are all sorts of models out there and, in the case of dark energy, they all have to fit one data point, making it kind of trivial to obtain a good result. In the meantime, astronomers are scrabbling around—in, yes, the dark—figuring out how to obtain more precise measurements of the accelerating expansion of the universe.

In particular, there is a set of models that predict that the distribution of dark energy is not uniform, meaning that measurements of the velocity of stars at different distances and directions should be able to tell theoreticians whether barking up this particular tree is worthwhile. However, there is a problem: it is quite difficult to measure these velocities. Locally, astronomers use Type Ia supernovae as references for distance and speed, but the farther away the supernovae are, the weaker the signal, and the more significant confounding sources of noise become.

One source of noise is gravitational lensing, which causes an apparent change in the brightness of the supernova, resulting in an incorrect distance calculation. A pair of Chinese astronomers have now examined the problem and shown that the signature of gravitational lensing can be removed.

A gravitational lens will often smear the image of the star into an arc shape, depending on the relative location of the star, the lens, and the telescope. The behavior of the lens is relatively static and its influence can be calculated in two dimensions by examining the correlations between points on the image and calculating the spatial frequencies of those correlations—dark matter can be observed through this method.

However, this 2D power spectrum does not allow a correction to be made for the distance and velocity of the star. To do that, the researchers performed the correlation and power spectrum calculations in 3D. The supernova light has most of its power along the line of sight, while the lens power spectrum remains 2D and at right angles to the line of sight. This effectively separates out the contribution of the lens, allowing researchers to correct for gravitational lensing.
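
As a rough illustration of the idea (a toy example of my own, not the authors' estimator), a 3D Fourier transform makes the split visible: a static lens only imprints structure on the plane of the sky, so its power sits in modes with zero frequency along the line of sight, while the signal of interest shows up in modes that vary along that axis.

```python
# Toy separation of transverse (lens-like) power from line-of-sight power.
# Purely schematic; the paper's actual 3D estimator differs in the details.
import numpy as np

rng = np.random.default_rng(0)
field = rng.normal(size=(64, 64, 64))      # stand-in for the observed 3D field
lens = rng.normal(size=(64, 64))           # static, purely transverse pattern
field += 0.5 * lens[:, :, None]            # a lens adds no variation along z

power = np.abs(np.fft.fftn(field)) ** 2

# Modes with k_z = 0 carry the lens-like (transverse) power; modes with
# k_z != 0 vary along the line of sight, where the supernova signal lives.
transverse_power = power[:, :, 0].sum()
line_of_sight_power = power[:, :, 1:].sum()
print(f"k_z = 0 (lens-dominated) power: {transverse_power:.3e}")
print(f"k_z != 0 (line-of-sight) power: {line_of_sight_power:.3e}")
```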

So, this seems like a pretty obscure bit of research to put on Nobel Intent, but I think it is important to show these slightly less sexy parts of the scientific process. Should dark energy models with a non-isotropic distribution of dark energy prove correct, measurements derived from observations of Type Ia supernovae will play a critical role in confirming them. Before that can happen, these sorts of problems need to be solved.

To give you some insight into how important this issue is to the astronomy community, during the time this paper was being written and going through peer review, four other papers on the topic were published or accepted for publication, presenting other ways to solve the same problem.

Physical Review D, 2008, DOI: 10.1103/PhysRevD.78.023006

Next up for pointless gaming laws? Illinois and FFXI

New York was the last state to pass a law forcing gaming companies to do something they already do, and it was such a great use of time and money that Illinois had to get in on the action. Following the trials of two parents trying to cancel a Final Fantasy XI account, the state passed a law saying that online games had to have a way to cancel your account online.

The summary of the bill, which was signed into law on Tuesday, follows:

…an Internet gaming service provider that provides service to a consumer… for a stated term that is automatically renewed for another term unless a consumer cancels the service must give a consumer who is an Illinois resident: (1) a secure method at the Internet gaming service provider's web site that the consumer may use to cancel the service, which method shall not require the consumer to make a telephone call or send U.S. Postal Service mail to effectuate the cancellation;

and (2) instructions that the consumer may follow to cancel the service at the Internet gaming service provider's web site. Makes it an unlawful business practice for an Internet gaming service provider to violate the new provisions.

I passed this over to our own Frank Caron. Caron, a while back, decided to work on his pasty Canadian complexion and canceled his Final Fantasy XI account in order to spend more time outside. How did he do it? He used the PlayOnline software that comes bundled with the game. As Frank points out, canceling your account is possible online, even if the software may seem obtuse to those who aren't familiar with this sort of service. "Besides, there are plenty of help files and 'contact us' notices to help guide users," he noted.

The law has good intentions, but are there many online games that don't allow you to do this? Was this a major problem? Don't you think someone would have looked into this a little more closely before writing the legislation? Sadly, we know the answer to that last question.

Thanks to GamePolitics for the heads up on this story. What do you guys think? Is canceling FFXI trickier than it needs to be? Do other games need to make this process more user-friendly? Sound off.

Portugal’s 500K Classmate PC order a nail in OLPC coffin

The government of Portugal has announced plans to launch a new education technology program called the Magellan Initiative, which aims to bring low-cost mobile computers to half a million young students. The laptops, which are being developed in collaboration with Intel, will be based on the company's Classmate PC reference design.

"This new collaboration with Intel underscores Portugal's commitment to advance quickly toward a knowledge-based economy," said Portugal's Prime Minister, Jose Socrates. "By equipping our schools with state-of-the-art computing technology and Internet connectivity, we hope to hasten the transition to economic models that benefit our citizens."

The deal is a major victory for Intel's Classmate PC concept which has been battling for mindshare against the One Laptop Per Child (OLPC) group, a nonprofit organization that emerged from the MIT media labs. OLPC has been afflicted with numerous setbacks and is presently mired in internal disputes that raise serious questions about the project's long-term viability. The Classmate PC, which doesn't suffer from OLPC's lack of proper distribution infrastructure or extreme dependence on scale, is beginning to look like a more viable long-term solution. OLPC has secured only 600,000 total orders worldwide.

Although the Classmate PC uses much less innovative hardware than OLPC's unique XO, the Classmate's more conventional approach and strong support for Microsoft's Windows operating system have given it a bit of an edge. Governments have been reluctant to commit to bulk orders of OLPC's largely unproven product. OLPC has attempted to combat this by offering Windows XP, a move that has alienated OLPC's open source supporters.

Intel and OLPC have had a strange relationship, alternating between mutual condemnation and vows of partnership. OLPC insisted that low-cost education laptop efforts should be consolidated and centralized so that economies of scale could be optimally leveraged. To this end, OLPC insisted that Intel should drop its Classmate PC project. Intel argued that having multiple options, suppliers, and distribution channels was a better and more sustainable approach. When a compromise couldn't be reached, the tenuous relationship was abandoned.

Various systems based on the Classmate reference design are already being sold in several countries. The existing models range from about $250 to $500 in price and include Intel Atom processors and 1GB of RAM. The precise configuration and cost-per-unit for the Portuguese model has not yet been disclosed. Intel is said to be working on an update to the hardware design, the details of which are expected to emerge next month.

Intel is also helping Portugal launch a new web-based interactive learning system called Intel skoool that is designed to provide resources for teaching math and science (but obviously not spelling). The web site was launched last November and continues to grow.

This looks like another nail in the coffin for the OLPC project. Governments around the world will likely be watching the progress of Portugal's Classmate rollout and it could answer a lot of questions about the efficacy of Intel's approach to low-cost education computers.

Being a better gamer: a guide to changing the world

Truth, Lies, and Video Games

Apathy is for losers. Fine, you don't have time to start a charity or fight Jack [Thompson], then let your work do the talking. Your collective creative output is the real ambassador that touches millions on a global basis. Games have the ability to transform the world. Don’t lose sight of that. You create culture. We ARE culture.

Like it or not, you are all already ambassadors for games. So, better make the most of that responsibility! Award or not, I can’t ever do that for you.

These words were spoken by Jason Della Rocca as he won the Ambassador Award at the Game Developers Conference this year. Della Rocca has done wonderful work for the International Game Developers Association, but his speech was inspiring: it's everyone's job to put a better face on gaming, because whether you're a hardcore shooter fan or you simply enjoy a few rounds of Peggle, you're taking part in an industry that many don't understand, and some would like to control. We've recently had the pleasure of speaking to some of the best minds currently working to help the gaming industry grow up, and to some very giving people who seek to put a human face on the suffering of many and alleviate some of that pain. Their instruments? Video games.

The video game industry is growing up. Luckily for us, many gamers are growing up with it, and we have many capable hands guiding the art form and protecting it from the misinformation about what gaming is, and the type of people who enjoy it.

The battle won't be won in the courts

If the mainstream media is to be believed, video games are about as healthy as cigarettes, except they cause psychotic breaks instead of cancer. Politicians use this artificially created fear of games as an excuse to pass legislation to criminalize the sale of certain games, although so far, each of these pieces of legislation has been successfully challenged and overturned in the courts. While more people than ever are playing games, and the audience for those games is expanding to include people of both sexes and all ages, the media's reporting on the industry and its effects on children remains reactionary and woefully misleading.

Being a good person and a gamer is easier than you think.

One of the biggest misconceptions about games is that they're the domain of adolescent boys. According to the Entertainment Software Association, the average age of gamers is now 35. Approximately 60 percent of gamers play video games with their friends, 40 percent of gamers in the US are women, and almost a quarter of all video game players are over the age of fifty. If video games are so popular and widely enjoyed, why are they still vilified in the media?

Who actually plays video games? The ESA's findings might surprise you.

The full answer to that question is complicated, but the short-and-simple version is that, all survey data aside, video games are still a new form of media that older generations by-and-large rarely see or experience first-hand. Human nature is simple: we fear what we don't understand.

"[The demographics for the ESA survey] would mean that for every 16-year-old kid there's a 50-year-old guy to average out to 33, and that’s just not what I see when I go to GameStop or when I read comments online," said Dennis McCauley of GamePolitics when he talked with Ars. "I think gaming is still largely a youth culture thing, at least in the sense of the most involved gamers." McCauley's logic makes sense: enjoying games like TextTwist, MineSweeper, and Solitaire might make older players more amenable towards video games in general, but that doesn't mean they fully understand the nature of the industry. Will Wright gave a speech where he pointed out that games aren't the first new trend to be distrusted. He told us about how he came across an article where the writer described a young man engrossed in some unfamiliar handheld object, apparently to the point where he was out of touch with his surroundings and didn't want to interact with others because it distracted him from his new obsession.

It turns out this account came from a medieval monastery; the object in question was a book in the hands of a young monk. Everything, when it's new to society, has to prove its worth.

Games are still perceived by many as toys, and toys are meant to be played with by children. But the idea of little tykes playing with mature content is more than enough to get concerned parents up in arms. "I think [this attitude is] due in part to the misperception that video games are primarily intended for kids," said Patricia Vance of the ESRB. "When you juxtapose that mis-perception with the presence of mature content in a video game, it causes concern. The more games are thought of in the same way as movies and TV shows, the more acceptance they’ll gain, and that’s been happening more and more in recent years."

Some of the biggest opponents of games in recent political history (clockwise from top left): Hillary Clinton, Eliot Spitzer, Andrew J. Lanza, and Leland Yee.

These two ideas about games are at odds: one group sees them as toys with inappropriate content, and the other sees an emerging art form that is giving movies and music a run for their money, quite literally. Now that games have managed to infiltrate the popular culture at large, the politicians are becoming even more alarmed, and the expensive campaign to pass laws controlling the content of games and their sales may find new allies.

Still, some in the industry are hopeful, and they think these attempts will trend downward. "[The political climate]'s still dicey, but a little better than it has been. The good news is that the number of serious legislative attempts to regulate games at the state level has dropped off over the last two years," said McCauley. "Right now, only Massachusetts is pushing legislation, and that bill seems stalled. The California law is still under appeal by Gov. Schwarzenegger, and we're waiting to see if Minnesota decides to take its bill to the US Supreme Court. They have exhausted their other avenues of appeal. New York had a bill that looked to be on the fast track to passage until Gov. Spitzer got caught going the GTA hooker route." Since McCauley talked to Ars, the New York bill has been signed into law by Gov. David Paterson (Spitzer's replacement), and the industry is potentially going to challenge it in court.

Politicians have made a point of trying to limit the sales and content of games they find morally objectionable, but the industry has managed to legally overturn every piece of legislation with an undefeated record of 9-0. This intimidating track record might not stall every attempt, but it sends a powerful message. The fact that gamers are making up an increasing percentage of voters might have a little something to do with it, too. At least part of this recognition is due to groups like the Entertainment Consumers Association, the Entertainment Software Association, and the ESRB making an effort to educate both the general populace and elected officials about the reality of video games. The gamers themselves have stepped up, and have become their own PR machine, showing the world that our favorite hobby is enjoyed by good, upstanding people.

First responders to FCC: give up national D Block pipe dream

They packed Brooklyn, New York's elegant Borough Hall yesterday—police and fire department officials, telco lawyers, and a former state Attorney General—lining up to tell all five members of the Federal Communications Commission how it should set up a national broadband public safety communications system. But before they spoke, FCC Commissioner Michael Copps laid out the truth: years into the process, the agency still isn't even sure what to do.

"What purpose should this network serve?" Copps asked in his opening statement at the en banc hearing. "Video? Data? A backbone that connects existing voice networks? Or a whole new, built-from-scratch network that can do it all?"

How much will it cost? Copps wondered. And most perplexing: "Should licenses be regional or national?"

Actually, many panelists at the hearing had an answer to that last question. Drop the vain attempt to auction off the so-called 700MHz "D Block" to one national licensee, they told the FCC. Instead, allow public safety agencies to access the spectrum and build their systems on a regional basis instead.

"There is nothing wrong, per se, with the goal of nationwide interoperability," argued James Farmer, former counsel to the 9/11 Commission as well as former New Jersey Attorney General. "The way that emergencies actually happen, however, suggests that the best way to achieve interoperability nationwide is by first building it locally and regionally, and then interconnecting the regional interoperable networks."

The head of New York City's Communications Division put it more plainly. "The Commission should not re-auction the 'D' Block spectrum," Charles Dowd said, "and should appeal to Congress to permit the allocation of this spectrum immediately to public safety."

The position that the government should authorize a broadband-based public safety communications network built "from the bottom up," as Farmer put it, comes in the wake of the single failed portion of the FCC's recent 700MHz auction, concluded in mid-March. That massive sale of soon-to-be-vacated analog TV channels raked in over $19 billion, but the spectrum reserved for public safety went unsold. A successful bidder on the D Block's two hunks of bandwidth would have leased them for commercial purposes from a public safety administration agency, but nobody met the FCC's $1.3 billion minimum asking price for the block.

A subsequent FCC audit found that potential buyers were scared to fork over the money, fearful that the public safety burdens of the public/private partnership would leave them with many obligations and too little revenue.

Chop it up…

So in May the FCC launched a new proceeding on what to do about the D Block. Since then it has gotten an earful from cities across the country who plead that they can't wait for the Commission to figure this one out. Chop that block into regions, they say, and let us build our own networks in partnership with commercial vendors. American cities have demonstrated that they can do the job "without the need for a public-private nationwide network," Paul Cosgrave, New York's Director of Information Technology, told the hearing.

Not surprisingly, the potential commercial partners for these proposed regional networks appeared at the FCC hearing, showing great enthusiasm for the idea. AT&T's marketing development director Stacey Black recommended that, instead of auctioning off the D Block on a national basis, the FCC give it over to the designated public safety administrator and let it develop national interoperability guidelines. Cities and regional safety planners would then submit RFPs (Requests for Proposals) that conform to the specs and build their own systems.

…or keep it whole?

But there are more than a couple of problems with this line of thought, defenders of the national D Block plan noted. Robert Gurss of the Association of Public Safety Communications Officials-International countered that, unlike New York City, many suburban and rural parts of the United States don't have the resources to build their own regional public safety communications infrastructures.

And some commenters worried whether a regionally-based plan can be unified without strong national oversight. "From my perspective, if you don't have some kind of leadership, some kind of national governance" to make even a cross-regional, inter-related system happen, "it will never happen," warned Harlin McEwen, Chair of the Public Safety Spectrum Trust Corporation, which will administer the public safety aspect of the D Block. "History has told us that we have lots of incompatible local systems that get built out with local requirements without any national consideration."

Queries about funding a regional scheme repeatedly popped up during the hearing, with a sensible observation coming from William J. Andrle, Jr. of Northrop Grumman Information Technology. "It does bear mentioning again that this exercise might be unnecessary if there was at least some direct funding from the federal government," Andrle stated during his testimony. "We have federal matching funds for highways, the environment, and other needs. Why nothing—zero—to facilitate the establishment of interoperable public safety broadband services that will enable first responders to better protect the public?"

Can the FCC change course?

Then there's the question of whether the FCC even has the legal right to drop its D Block plan and adopt a regional approach. A skeptical Commissioner Robert M. McDowell asked regional plan supporters AT&T and Verizon whether they thought the agency could legally make the switch.

"Did I hear AT&T and Verizon correctly?" McDowell said. "You don't know if we've got the statutory authority."

"I do not know, Commissioner," replied Verizon Vice President Don Brittingham.

"Yet you're making this proposal, without knowing… " an obviously bemused McDowell continued.

"Let me be more clear," Brittingham explained. "As a lawyer I don't know technically whether you have authority or not. But even if you don't currently what Verizon is saying is that in order to solve this problem in the best interest of public safety, Congress should step in and do that. There should be legislation to give you the authority to reallocate the spectrum."

Will Congress step in soon? McDowell wondered. AT&T's Black offered a few hopeful words. Larry Krevor, Vice President of Sprint-Nextel, was a little more candid with the Commission. "I think at this point you have to auction the spectrum," he advised. The FCC would have to go back to Congress to set up a different plan. "And certainly with the US about to have an election and a new Congress and new Committee assignments, etcetera, the Hill will probably not move as quickly as we might like."

Towards the end of the hearing, McDowell promised that the Commission will make a decision on the D Block, but also confided his frustration. "The bottom line is that there is no bottom line right now," he said. "There is no critical mass behind a particular set of concepts other than something needs to be done. It's hard to find agreement. So again, I will recommend a strong prayer for the FCC."

Further reading:

Testimony of the panelists at the FCC's hearing on the D Block

IE8 Beta 2 getting heavy performance, crash-recovery tweaks

More details about Microsoft's next version of its ailing browser have been released in the build-up to the second beta release, due next month. The first beta, released in March, was aimed at web developers. It brought much-needed improvements to standards compliance, though its reliability and performance left much to be desired.

Beta 2 is aimed at a general audience; not just web developers who need early access to IE8 to find out what breaks (and what finally works), but a broader audience including IT staff evaluating the next browser version so they know whether to deploy it as well as end-users who just have to run the latest version of everything even if it isn't quite finished yet. As well as the all-important standards compliance, IE8 brings a raft of new security, reliability, and management features. The official IE blog has described some of these already, and on Monday gave more details about what to expect in beta 2.

With IE8, Microsoft is attempting to solve one of the most annoying problems with today's multi-window, multi-tab browsers; namely, the disastrous effect that a browser crash has. It is an unfortunate feature of most browsers that a crash in one tab takes down the whole browser instance. Whether the cause is a bug in the browser itself, a malicious script, or a badly-written plug-in, the effect is the same; not only does the tab that caused the problem disappear, so does the tab with your half-composed forum post, the train timetable you need to get home, and the audio stream you're listening to.

IE8 tackles this by separating each tab into its own process, a feature it calls "Loosely Coupled IE." Starting IE8 actually creates two processes; one process for the window frame, address bar, toolbar, and tab bar, and a second process for the tab itself. Subsequent tabs may also open in new processes. Running a tab in its own process allows that tab to crash (for any reason) without disrupting any other tab. This feature was present in Beta 1; in Beta 2, Microsoft has worked to reduce the overhead it causes and improve its performance. For example, now the processes creating the window frames are merged, so starting IE several times will only create new tabs in the existing frame.
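
To see why per-tab processes pay off, here is a toy sketch of the isolation idea (not IE8's actual code, and Python rather than anything Microsoft ships): each "tab" runs as a separate OS process, so a hard crash in one leaves the frame process and the other tabs running.

```python
# Toy model of "one process per tab": a crash in one worker leaves the frame
# process and the other tabs untouched.
import multiprocessing as mp
import os

def tab(url: str) -> None:
    if "crash" in url:
        os._exit(1)                 # simulate a hard crash confined to this tab
    print(f"[pid {os.getpid()}] rendered {url}")

if __name__ == "__main__":
    urls = ["http://example.com", "http://crash.example", "http://ars.example"]
    procs = [mp.Process(target=tab, args=(u,)) for u in urls]
    for p in procs:
        p.start()
    for p, u in zip(procs, urls):
        p.join()
        status = "crashed" if p.exitcode else "ok"
        print(f"frame: tab for {u} {status}; other tabs unaffected")
```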

The ratio between tabs and processes is not necessarily 1:1; although one process per tab provides the most isolation, the actual ratio of processes to tabs will depend on machine capabilities. This process separation also resolves a major annoyance in IE in Windows Vista. In Vista, sites in different zones cannot be open in the same IE window. A file opened from the hard disk cannot coexist with a file opened from the Internet; instead, two different IE processes are required, one for each security zone. Because IE8 uses different processes for each tab, this restriction is lifted; different security zones will still use different processes behind the scenes, but they will be able to share the same window.

The final piece of the puzzle is Automatic Crash Recovery. As with LCIE, this was present in Beta 1, but has been improved for Beta 2. ACR is designed to improve the experience when the inevitable occurs and a tab crashes. Instead of losing everything you were doing in the tab, ACR restarts the process and restores the tab's context—in Beta 1, this meant it opened the same URL and kept the back/forward browser history.

ACR had promise in Beta 1; however, it neglected to recover the most important things—text entered into forms, and session cookies. Without these, the experience is a little frustrating; the browser reopens the right page, but you find yourself logged out and with your half-written e-mail gone. Beta 2 fixes this by recovering both form data and session cookies. This means that Beta 2 will be able to put you right back where you were before the tab crashed, with virtually no interruption.

As well as being incomplete, ACR in Beta 1 was not itself particularly reliable; it was easy to make the browser get into a never-ending cycle of crashing, restarting, recovering, and then immediately crashing (because the URL being recovered caused the crash in the first place). Microsoft has not said anything about whether this will continue to be a problem.

Of course, while better handling of crashes is no bad thing, it would be even better for the browser not to crash in the first place. Microsoft has long had an (opt-in) system for reporting crashes and hangs back to the company—Windows Error Reporting (aka, Watson). This data allows the company to locate bugs and determine which are in need of the most attention. On the blog, the IE team stated that they have committed to fixing the top 50 percent of all the Watson errors they have; this should provide a significant boost to reliability.

When IE8 is released later this year it will undoubtedly be the best version of Internet Explorer ever. IE's competition is improving all the time, and gaining in popularity, and—at least when it comes to standards compliance—is already superior today to what IE8 will deliver later in the year. Microsoft's uphill battle to stop the rot and turn IE around is far from over.

Short wavelength ultraviolet semiconductor laser diodes

Modern electronics have been reaping the benefits of cheap, compact semiconductor lasers for years now. From the revolutions in CD and DVD optical data storage, through seemingly simple devices like barcode scanners, to high-powered industrial lasers for material fabrication, the semiconductor laser has been indispensable. Researchers have now successfully produced semiconductor laser diodes that produce light in the ultraviolet range, where short wavelengths promise a new level of precision in scientific applications and industrial processes, along with an increase in optical data storage densities.

Several different methods exist for producing the coherent light that defines a laser, ranging from gas excitation and chemical reactions to exotic free-electron lasers. Semiconductor lasers differ in that they are a single component with no moving parts, making them very compact. The added bonus is that they can be produced cheaply and efficiently using known semiconductor production techniques widely adopted by industry. Different wavelengths of semiconductor lasers exist, from infrared to deep blue and near ultraviolet, each using a unique semiconductor material that generates photons when its electrons relax from their excited state.

Ultraviolet semiconductor lasers have not suffered due to the lack of a candidate material, but rather from a lack of precision in their fabrication. A close relative of the semiconductor laser, the LED, has been able to generate deep ultraviolet light using various combinations of gallium, indium, and aluminum, along with some added nitrogen. Fabricating diodes with these materials, however, usually results in excessive stress and cracking—not a problem for LEDs, but a show-stopper for laser diodes. The use of indium allows for successful fabrication of a laser diode, but that shifts the output light to a longer, less desirable, wavelength.

Researchers set out to make the first indium-free aluminum-gallium-nitride laser diode that has a low enough defect density to produce laser emissions. The plan of attack included a precision deposition technique called (get ready for it) hetero facet-controlled epitaxial lateral overgrowth, or hetero-FACELO. Various layers of the AlGaN with differing atomic ratios were deposited on a sapphire substrate to form waveguides, electrical contacts, claddings, and the all-important multiple quantum well layer, which acts as an electron trap and amplifier of sorts. The resulting laser diode produces a tight output peak at 342.3nm, with a broadening and shift of the peak when lower currents are used to operate the diode.

Although your supermarket's barcode scanner may not be in dire need of an ultraviolet laser, the high-tech sector is always in need of more precise fabrication and detection equipment, which an ultraviolet laser can offer. With ultraviolet laser emission now possible in the form of a semiconductor diode, the door is open for the next generation of laser-based consumer devices.

Nature Photonics, 2008. DOI: 10.1038/nphoton.2008.135

Hands on: Labmeeting’s social networking for researchers

Social networking is arguably one of the biggest Internet developments of the past few years. From sites purely for fun like MySpace and Facebook to career networking like LinkedIn, new social networking operations are springing up to serve smaller discrete communities. One of the latest of these is Labmeeting, designed specifically for scientists.

The site describes itself as being there to "help with those things that make doing science needlessly difficult," such as finding collaborators or competitors, rescheduling meetings, and sharing publications and protocols. Personally, I've not found any of those things needlessly difficult, but maybe that's just me.

You'll need an academic e-mail address to register, and the site is only really useful to those of us in life sciences; physicists, social scientists, et al need not apply.

The site lets you search for colleagues in the "search for people" text box. Trial and error revealed that some keywords also work in this box—by entering the names of my current and former institutions I was able to find a few people from both (although no one that I know), but it might behoove the site to disambiguate this a little.

The Papers section of Labmeeting

It feels like it could still do with some work in other areas. Although I was able to search for publications in PubMed, and eventually upload a file, the EndNote library importer wasn’t working at the time of writing. If a publication you find has an author with the same last name and first initial as you, Labmeeting allows you to identify yourself as one of the authors, and the paper is uploaded to your profile.

Finally there are tools that allow you to share documents and calendars with other lab members, but since there were only five individuals from my institution (including me) and none of them are from my lab, I was unable to put this to the test. While it's a nice idea, I'm a little dubious that it will really see widespread adoption. Firstly, you would need to have a Principal Investigator (PI) that knew about it, and then made sure everyone in the lab signed up; in reality, it's a lot easier to just e-mail everyone.

The online paper management seems more useful, although as a Mac user I'm an avid supporter of Papers, which might not store things online but is the best way to manage a literature collection I’ve come across.

Assuming Labmeeting can do a good job of making itself known to the scientific community, it might become more widespread. TechCrunch notes that the site has plans to offer its services to drug companies, biotechs, and other industry types, for which it will charge a fee—although presumably only if they can make a case for being better than an intranet. There is stiff competition out there though, with sites like Nature Networks, SciLink, Epernicus, and even old timers like Web of Science. As with much else in science, peer review will make or break them.

NSF’s Internet GENI testbed gets money, Internet2 bandwidth

Last spring, the National Science Foundation announced the launch of a project, termed GENI, designed to be a testbed for the academic community. After several years of planning, the NSF awarded a contract to BBN Technologies (creator of ARPANET), which would oversee the development of a sandbox for computer science researchers to test next-generation networking technology and software. This past week has seen two big announcements from BBN: a first round of grants and lots of dedicated bandwidth.

First, the money. Last week, BBN announced that the NSF had sprung for a grant worth $12 million spread over three years. That money will be divided up among as many as 29 institutions, both academic and industrial, that have been chosen through a peer-reviewed application process. The grants will fund what BBN is terming "Spiral 1," a set of prototypes for the infrastructure of the GENI experimental network. Based on earlier planning announcements, these projects will likely focus on the creation of a programmable network infrastructure that's flexible enough to handle the experimental work that will follow.

All of that work would be in vain if there were no place to run the traffic, but the news for BBN is good here, too. Internet2, which upped its inter-institutional bandwidth to 100Gbps last year, announced an agreement with BBN in which 10Gbps would be set aside for use on GENI. Internet2 has always reserved a fraction of its capacity for use in network research; it's not clear whether this represents an additional commitment to research, or whether the GENI work will subsume any earlier projects pursued by the group.

The science community has seen a number of major projects that were funded for part of the development phase, and then cut in the face of budget constraints. GENI appears to be safely navigating the funding process so far, which is good news for those who hope to see improvements in both network infrastructure and the applications that run on it.

As we noted in our report on academic networking, however, it's important not to confuse these research networks with the Internet we'll be using a decade from now; the needs and goals are quite distinct. Unfortunately, the Associated Press appears not to have gotten that memo, as they termed the work "a massive project to redesign and rebuild the Internet from scratch," terminology they appear to have picked up from the title of a 2005 talk (PDF) that dates from when GENI was still in the planning phases.

Analysis: why Apple won’t drop Intel chipsets any time soon

A recent rumor making the rounds suggests that Apple will be switching from Intel chipsets for its products to chipsets made by one of Intel's competitors, either AMD or VIA. I'm skeptical, but the rumor has gotten enough traction that it's worth taking a closer look.

If Apple is going to use non-Intel chipsets, the first question that must be answered is: where? In desktops, laptops, or both? Let's take the desktop chipset possibility first.

The only desktop chipset replacement for Intel that I realistically could see Apple using, given what I know of the company and its current preference for all things CUDA (look for some GrandCentral coverage before long) is an NVIDIA part, and the only reason I could see them using NVIDIA is to roll out a tower with dual-GPU capabilities. (Intel's Bloomfield will do SLI with NVIDIA GPUs, but probably not as well as comparable NVIDIA products.)

The NVIDIA SLI scenario is mildly plausible, given how seriously Apple takes data parallelism. The company has long had internal "GPGPU" efforts aimed at providing internal developers with ways to use the GPU to speed up their apps, and Snow Leopard will represent a leap forward in Apple's OS-level support for multicore and data-parallel coprocessors. So a Snow Leopard plus NVIDIA SLI combo could be a match made in media processing heaven.

The problem with this theory, however, is that Snow Leopard is scheduled to arrive sometime in the summer of 2009, which is also when Intel's Larrabee is set to launch. And I've heard from a source that I trust that Apple will use Larrabee; this makes sense, because Larrabee, as a many-core x86 multiprocessor, can be exploited directly by GrandCentral's cooperative multitasking capabilities.

But the real development that makes this chipset rumor implausible to me is Nehalem. Intel's Nehalem is due out at the end of this year, and if NVIDIA (or any other chipset maker) has a license for Intel's new QuickPath interconnect I'm not aware of it. So Apple would have to switch right back to Intel chipsets for their upcoming Nehalem towers. And indeed, those towers could very well be the mysterious margin-reducing products that Apple referred to on their conference call.

To turn our attention to mobiles, I can't think of a good reason for Apple to move away from Intel's mobile platform. The only possible exception here would be the MacBook Air, where Apple might like to pair Intel's custom-packaged Core 2 Duo with a more capable integrated graphics processor than what Intel's chipsets provide. I'm not sure if this is feasible, though, given that both the CPU and the chipset in the Air have special, reduced-footprint packages. NVIDIA would have to match this packaging effort, and that's unlikely.

Ultimately, I remain unconvinced by this latest round of speculation. Given what I know of Apple and Intel, and the two companies' software and hardware roadmaps, I'd expect them to get even cozier over the next year or two, not grow further apart.

New York Bar exam policy objects to Macs, Boot Camp

Macs may be enjoying new inroads and great sales records as of late, but there's one place you definitely won't find one of Apple's computers yet: The New York State Bar Examination.

As our friends at TUAW have pointed out, April Dembosky at the New York Times reports that thousands of recent law school graduates are deep within the throes of their bar exam in New York. As recently as 2003, portions of the exam were opened to being completed on a notebook computer, though limited seating was offered only on a lottery basis. Last summer was the first time anyone with their own notebook could bring one in, though the board has a strict "can't blame it on the dog" policy when it comes to technology:

Technical difficulties may include hardware or software malfunctions, data saving or retrieval problems, operator errors, upload or download problems, or the loss of electrical power at the examination facility. In the event any technical difficulties occur during the bar examination, you must handwrite your essay answers in the answer books provided and no additional time may be allowed. If you choose to continue to use your computer to write your essay answers after experiencing technical difficulties, or when you have been instructed not to do so, you do so at your own risk.

To make matters worse for Mac users—even those who opt to set up Boot Camp and install Windows for the Microsoft-dominated law industry—the board's policy is just as strict on thinking different for the exam:

We do not support Apple products in any form including Intel-based laptops running Boot Camp — no exceptions.

The exam software uses various methods to lock out all other apps for the duration of the test to prevent Wikipedia from answering too many of the questions. Still, while the software is designed exclusively for Windows, the New York State Board of Law Examiners appear to be spooked out of their leather seats at the very notion of Apple hardware, despite the fact that some Macs run Windows just as well as, if not better than, most PCs.

Judging from the rest of Dembosky's report, though, it sounds like the Windows software needs quite a bit of help before Mac-slinging law students can begin pining for a compatible version. Horror stories of software gone awry and nuking test answers during and after the exam have most students spooked, as only half of this summer's 12,000 candidates opted to forgo pen and paper.

Lord British wants to launch your DNA into space

Tabula Rasa has had some cool promotions. Not only did NCsoft send a number of lucky people on a Zero G flight, but now Richard Garriott is willing to digitize a select number of gamers' DNA and place it in a time capsule (dubbed the "Immortality Drive") that will be stored on the International Space Station when Lord British visits it in October.

This project, dubbed "Operation Immortality," is Garriott's way of "saving" the human race by creating an archive of humanity's greatest moments, as well as giving gamers the opportunity to become a part of history… so long as they play Tabula Rasa. Players who register for the contest will be eligible to have their DNA sequenced and digitized, the theory being that if something awful happens to the human race, this archived genetic code could be used to help resurrect humankind. Anyone can go onto Operation Immortality's website and vote on what humanity's greatest moments are, and anyone with an active Tabula Rasa account will have their characters placed on the Immortality Drive, too. "This is your chance to leave your mark," said Garriott. "While everyone can participate in the polls… Tabula Rasa players will be the only gamers in the universe who can say that a piece of them is in space, since we're sending their in-game alter-egos, and for some, their DNA, to space with me."

Creating a "save game" for humanity is certainly a novel way of promoting the game, and it's definitely a neat way to involve people from around the world in creating a time capsule. The idea of voting on humanity's greatest achievements is definitely a nice touch, too, but I'm not entirely sure that aliens who find the capsule in a few thousand years are really going to be concerned about what our favorite movies and TV shows were….