Newly found hybrid attack embeds Java applet in GIF file

Researchers at NGSSoftware have developed a hybrid attack capable of hiding itself within an image and intend to present details on the exploit at the Black Hat security conference next week. New and esoteric attacks are part and parcel of what Black Hat is about, but this particular vector could target web sites with a particularly vulnerable population: MySpace and Facebook. Social networking web sites tend to attract younger users, and while this particular attack can be used in a variety of ways, embedding the hook in profile photos that are then seeded and targeted at the teen crowd could be a very effective tactic.

The full details of the attack won't be available until next week, but Network World has managed to glean some key facts on its operation. The NGSSoftware team has found a way to embed a Java applet within a GIF; the hybridized file is referred to as a GIFAR. To be clear, GIFAR is a label of convenience, not an actual file extension or registered file type. The GIFAR exploit works because two different programs see the same file differently. The web server that actually holds the file sees it as a GIF file, and serves it accordingly, but when the "image" actually reaches the client, it's opened as a Java applet and run.
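The exact recipe won't be public until the talk, but the general idea behind this kind of polyglot file is well understood: GIF decoders parse from the start of a file, while JAR (ZIP) readers locate the central directory at the end, so both formats can coexist in a single blob. Below is a minimal sketch of that idea in Python, assuming simple concatenation is sufficient; the filenames are hypothetical, and this illustrates the general polyglot concept, not NGSSoftware's actual exploit.

    # Sketch of a GIF/JAR polyglot ("GIFAR") built by simple concatenation.
    # Assumption: GIF parsers read from the front of the file, while JAR/ZIP
    # readers find the central directory at the back, so both can share a file.
    # This is NOT NGSSoftware's method, just the general polyglot idea.
    def make_gifar(gif_path, jar_path, out_path):
        with open(gif_path, "rb") as gif, open(jar_path, "rb") as jar:
            payload = gif.read() + jar.read()
        with open(out_path, "wb") as out:
            out.write(payload)

    # Hypothetical filenames, for illustration only.
    make_gifar("avatar.gif", "applet.jar", "avatar_polyglot.gif")

A web server serving the result sees a well-formed GIF; a Java plugin told to treat the same bytes as an applet archive finds a valid JAR.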

Simply viewing a GIFAR won't infect a system; the attack method requires that the user be linked to the hybridized infection from an appropriately malicious web site. Despite its name, this attack method is not limited to GIFs; ZDNet's Zero Day blog has additional information on the exploit, and states that a number of files could be combined with .JAR, including both JPEGs and DOCs. This seems to indicate that one could actually hide a Java applet inside another Java applet, and then tie both of them together with a BINK file, but the resulting mess would probably fail, even as comedic relief.

The root of the problem isn't within Java itself, but results from weak web application security. ZDNet's blog entry implies that the attack vector might be significantly reduced if web applications would actually parse a file's contents, rather than simply checking the extension. The research team will leave some details of the attack out of its presentation to prevent immediate exploitation, and Sun intends to issue a patch that will serve as a short-term correction to the problem.

Mapping the peculiar velocities of stars

All things dark are all the rage in cosmology at the moment. There is dark matter—a type of matter that only weakly interacts with light. And dark energy—the label used to denote the observed increase in the rate of expansion of the universe. Our knowledge of what dark matter is and what dark energy denotes is woefully inadequate, opening up a theoretician's paradise. There are all sorts of models out there and, in the case of dark energy, they all have to fit one data point, making it kind of trivial to obtain a good result. In the meantime, astronomers are scrabbling around—in, yes, the dark—figuring out how to obtain more precise measurements of the universe's accelerating expansion.

In particular, there is a set of models that predict that the distribution of dark energy is not uniform, meaning that measurements of the velocity of stars at different distances and directions should be able to tell theoreticians whether barking up this particular tree is worthwhile. However, there is a problem: it is quite difficult to measure these velocities. Locally, astronomers use Type Ia supernovae as references for distance and speed, but the further away the supernovae are, the weaker the signal, and the more significant confounding sources of noise become.

One source of noise is gravitational lensing, which causes an apparent change in the brightness of the supernova, resulting in an incorrect distance calculation. A pair of Chinese astronomers have now examined the problem and shown that the signature of gravitational lensing can be removed.

A gravitational lens will often smear the image of the star into an arc shape, depending on the relative location of the star, the lens, and the telescope. The behavior of the lens is relatively static and its influence can be calculated in two dimensions by examining the correlations between points on the image and calculating the spatial frequencies of those correlations—dark matter can be observed through this method.

However, this 2D power spectrum does not allow a correction to be made for the distance and velocity of the star. To do that, the researchers performed the correlation and power spectrum calculations in 3D. The supernova light has most of its power along the line of sight, while the lens power spectrum remains 2D and at right angles to the line of sight. This effectively separates out the contribution of the lens, allowing researchers to correct for gravitational lensing.
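As a rough numerical illustration of that separation, the sketch below builds a toy 3D field, takes its power spectrum, and splits the power into modes that lie mostly along the line of sight (the z-axis here) versus transverse modes. The grid size, random field, and mode split are arbitrary choices made for demonstration; this is not the authors' actual analysis.

    # Toy separation of line-of-sight vs. transverse power in a 3D field.
    # Illustrative only; not the pipeline used in the paper.
    import numpy as np

    n = 64
    field = np.random.normal(size=(n, n, n))        # stand-in for the mapped signal
    power = np.abs(np.fft.fftn(field)) ** 2

    k = np.fft.fftfreq(n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k_perp = np.sqrt(kx**2 + ky**2)                 # transverse modes (lens-like, 2D)
    k_par = np.abs(kz)                              # line-of-sight modes (velocity-like)

    los_power = power[k_par > k_perp].sum()
    transverse_power = power[k_perp >= k_par].sum()
    print(los_power, transverse_power)

In the real analysis, the lensing contribution concentrates in the transverse modes, so subtracting it leaves a cleaner estimate of the line-of-sight signal.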

So, this seems like a pretty obscure bit of research to put on Nobel Intent, but I think it is important to show these slightly less sexy parts of the scientific process. Should models with a non-isotropic distribution of dark energy prove correct, measurements derived from observations of Type Ia supernovae will play a critical role in confirming them. Before that can happen, these sorts of problems need to be solved.

To give you some insight into how important this issue is to the astronomy community, during the time this paper was being written and going through peer review, four other papers on the topic were published or accepted for publication, presenting other ways to solve the same problem.

Physical Review D, 2008, DOI: 10.1103/PhysRevD.78.023006

Next up for pointless gaming laws? Illinois and FFXI

New York was the most recent state to pass a law forcing gaming companies to do something they already do, and it was such a great use of time and money that Illinois had to get in on the action. Following the trials of two parents trying to cancel a Final Fantasy XI account, the state passed a law saying that online games must have a way to cancel your account online.

The summary of the bill, which was signed into law on Tuesday, follows:

…an Internet gaming service provider that provides service to a consumer… for a stated term that is automatically renewed for another term unless a consumer cancels the service must give a consumer who is an Illinois resident: (1) a secure method at the Internet gaming service provider's web site that the consumer may use to cancel the service, which method shall not require the consumer to make a telephone call or send U.S. Postal Service mail to effectuate the cancellation;

and (2) instructions that the consumer may follow to cancel the service at the Internet gaming service provider's web site. Makes it an unlawful business practice for an Internet gaming service provider to violate the new provisions.

I passed this over to our own Frank Caron. Caron, a while back, decided to work on his pasty Canadian complexion and canceled his Final Fantasy XI account in order to spend more time outside. How did he do it? He used the PlayOnline software that comes bundled with the game. As Frank points out, canceling your account is possible online, even if the software may seem obtuse to those who aren't familiar with this sort of service. "Besides, there are plenty of help files and 'contact us' notices to help guide users," he noted.

The law has good intentions, but are there many online games that don't allow you to do this? Was this a major problem? Don't you think someone would have looked into this a little more closely before writing the legislation? Sadly, we know the answer to that last question.

Thanks to GamePolitics for the heads-up on this story. What do you guys think? Is canceling FFXI trickier than it needs to be? Do other games need to make this process more user-friendly? Sound off.

Gartner: $100 notebook is several years off

Several commercial, governmental, and charitable projects have aimed at getting the price of laptops down, with $100 being the magical figure, but a new report from Gartner argues that this won't be achieved for a number of years and isn't the right focus for now.

The OLPC's XO PC was originally touted as a $100 PC, at the kind of low price that could easily put millions upon millions of laptops in the hands of third-world children. This would, its supporters claimed, cause a global renaissance, as computer exposure in the third world propagated technical education and other desirable skills in a worldwide cascade of economic development. Intel's Classmate PC had similar ambitions, although at a slightly higher price point. Both units, however, ran into pricing problems that raised their costs significantly. The XO, in particular, now costs almost twice as much as originally projected. Similar difficulties have plagued Asus' Eee PC, which was originally supposed to cost $200 and now costs $300 in its cheapest configuration.

Gartner projects that this scenario will continue for some time, and that the $100 laptop goal will not materialize for several years at least. Citing scaling difficulties and component costs, the firm estimates that costs cannot fall by more than 15 percent or so over the next few years. Even if that happened, the XO would still be sitting at over $150, far short of its goal.

Instead, Gartner urges, the logistical and other details of the educational mission of the XO and its competitors should be explored and solved. A myopic focus on getting piles of cheap hardware out the door ignores, the firm says, problems of distribution, precise targeting of hardware to the needs of third-world users, and financing. Infrastructure for maintenance and repair, Internet access, and appropriate curriculum development are also important. A focus on these ends will allow subnotebook vendors to better serve education markets with devices that can better help students, even if they are slightly more expensive than they could be, Gartner argued.

Since the beginning of the OLPC project, subnotebooks have spread into the consumer market in a big way, but Gartner seems to be predicting a strong degree of market segmentation between education and consumer devices. This is consistent with the present market; of available devices, only the Eee even remotely bridges the gap between consumer devices like the HP 2133 and education models like the Classmate PC and XO. Some frustration has emerged that OLPC hasn't sold the XO commercially, though, so this segmentation might be artificial.

Gartner is optimistic about the ultimate future of the platform for all kinds of users all over the world. "We expect to see increased product innovation in the PC market during the next few years," said Gartner research director Annette Jump. "Mini-notebooks will create opportunities to reach many buyers across all regions, both in mature markets as additional devices, and in emerging markets as PCs."

Subnotebooks have a glorious future ahead of them, but buying into it will cost more than one Benjamin.

Microsoft misses Windows Mobile goal by 2 million licenses

In an interview with Andy Lees, senior vice president of the Redmond company's Mobile Communications Business, Todd Bishop managed to grab some interesting facts about recent Windows Mobile developments.

First, Microsoft sold more than 18 million Windows Mobile software licenses in the last fiscal year, which ended June 30, 2008. This was about 2 million short of Microsoft's widely publicized 20 million target. Lees explained that a few OEMs shipped their devices later than expected, and this was the reason for the shipment target being missed. He declined, however, to say which ones: "That would be unfair to the OEMs."

Secondly, Microsoft increased Windows Mobile's share of the worldwide mobile phone operating system market. According to IDC data, Windows Mobile unit sales have grown faster than the overall market, expanding from slightly more than 11 percent to just under 13 percent of the worldwide smartphone market. Two months ago, Microsoft announced it expected Windows Mobile sales to grow by at least 50 percent in the upcoming fiscal years 2008 and 2009.

Thirdly, Lees claimed the shortfall in Windows Mobile unit sales didn't have a material impact on revenue in the Entertainment & Devices Division. Two million may seem like a large number, but the division has revenue coming in from various sources that don't depend on Windows Mobile software licenses.

After acquiring Danger in February, Microsoft has now announced that its subsidiary teamed up with T-Mobile USA to release the new T-Mobile Sidekick. The device features support for video capture, playback, and sharing; wireless stereo music and media sharing via Bluetooth; quick friend search and group chats in instant messaging; customizable Web browsing; universal search across all phone applications and data; and a customizable shell. Collaborations like this one are great for pushing Microsoft services onto new phones, but they won't help much in revenue or unit sales since the Sidekick doesn't run Windows Mobile.

Further reading:
Todd Bishop's Microsoft Blog: Windows Mobile misses annual shipment target
Microsoft: Press Release

Microsoft calls out for more IE8 beta testers

Internet Explorer 8 Beta 2 will be aimed at the end-user (the first beta was aimed at the web developer) and, while all its features haven't been disclosed yet, we already know about some expected reliability and performance improvements. The next beta version is scheduled to arrive sometime next month and, while it will be a public beta release just like Beta 1 was in March, Microsoft is asking for more quality testers for its beta program.

Anyone can currently discuss the betas on the IE8 newsgroup (monitored by Microsoft MVPs and IE team members), and can vote on IE8 bugs reported by testers via Microsoft Connect. However, the company feels this isn't enough, since the only direct way to file a bug report is to be an official tester. So the IE development team is looking for more users willing to dedicate themselves to improving the next version of the world's most popular browser. Microsoft is asking anyone interested in filing bug reports to e-mail the team "a little about yourself including why you'd be a great beta tester."

Usually the company relies on surveys or beta invites, but it appears that the software giant is only looking for truly dedicated testers in this beta program. Microsoft doesn't have much time (about four months) after the release of Beta 2 in August if the company wants to get the final version of IE8 out the door by the end of the year. There has not yet been a mention of a Beta 3 but, given the timeframe Microsoft has and the fact that RCs still need to be released, even assuming that there will be no delays, the release of yet another beta is unlikely.

Further reading: IEBlog: Wanted: IE8 Beta Testers

New 3.2Gbps FireWire spec approved, not as fast as USB 3.0

The Institute of Electrical and Electronics Engineers (IEEE) 1394 working group behind the development of FireWire in both its 400Mbps and 800Mbps configurations has formally approved the next-generation S1600 and S3200 standards. These two standards build on the already established FireWire ecosystem, and will offer speeds of 1.6Gbps and 3.2Gbps, respectively. The final specification itself should be published in October, but there's no word on when we'll see shipping product, or what the adoption ratio between S1600 and S3200 will be.

Backwards compatibility concerns, thankfully, should be kept to a minimum. The new S1600/S3200 cables will be fully compatible with both older FireWire 800 cables and FireWire 400/800 devices. S3200, meanwhile, isn't the end of the line for FireWire technology, as current plans call for the interface to scale up to at least 6.4Gbps over time. That's not going to happen any time soon, but there's obviously still plenty of headroom in the interface itself.

The IEEE 1394 standard will face a new competitor in the form of USB 3.0. USB 3.0's specification is expected to be published by the end of the year, which may give S3200 a few months' head start. FireWire, however, has never enjoyed USB's widespread success, and as a result, could find itself the first standard out the door, but the last standard on the shelf. Motherboard manufacturers will add USB 3.0 to high-end boards as soon as chipsets are available (even if devices aren't), but FireWire ports are considerably harder to come by.

That's not to say they don't exist, but FireWire 400 is easier to find than FireWire 800 (except on Macs), and the number of available ports is typically limited to 1-2, even on a high-end motherboard. USB 2.0 ports, on the other hand, are plentiful, with most boards offering 8-12 in some combination of included ports and onboard headers. The peripheral interconnect field is also more crowded now, and S1600/S3200 will have to compete against eSATA, as well.

Daring to mention USB 2.0's dominance over FireWire inevitably brings the standard's defenders out of the woodwork, and to be fair, FireWire has always been the more technologically advanced standard, with its faster transfer speeds, lower CPU utilization, and the ability to provide more power to attached devices (devices that can run off a single FireWire port could well require two USB ports). These advantages, however, have never managed to overcome USB 2.0's general popularity, and FireWire remains a niche interface outside certain peripheral markets (e.g., video cameras), where it has always done well, and Macintosh computers.

Broad market penetration notwithstanding, the appearance of a faster FireWire standard will be warmly greeted by anyone frustrated by FireWire 800 transfer speeds who doesn't want to deal with the potential hassles of USB 3.0.

Microsoft: number of 64-bit Vista PCs doubled in three months

Unlike Apple, Microsoft does not control the hardware that its software runs on. This means that Apple can more easily move all its users to an x64 operating system: all Macs currently have 64-bit CPUs, and Snow Leopard is rumored to be a 64-bit-only release. Windows 7, on the other hand, will still be released in both x86 and x64 versions. Microsoft would prefer not to have to make Windows 7 available for 32-bit CPUs (indeed, Windows 7 Server will be x64-only), but the decision is driven by software compatibility demands.

Many businesses still use 16-bit applications, which cannot run on a 64-bit operating system, or have 32-bit applications that for one reason or another don't run properly in an x64 environment. Few software developers offer x64 programs, and Microsoft doesn't want to rush them; the software giant wants the industry to make the move by itself. Apparently, this has already started. On the Windows Vista Team Blog, Microsoft has posted details of how the PC industry is moving from 32-bit to 64-bit PCs:

We've been tracking the change by looking at the percentage of 64bit PCs connecting to Windows Update, and have seen a dramatic increase in recent months. The installed base of 64bit Windows Vista PCs, as a percentage of all Windows Vista systems, has more than tripled in the US in the last three months, while worldwide adoption has more than doubled during the same period. Another view shows that 20 percent of new Windows Vista PCs in the US connecting to Windows Update in June were 64bit PCs, up from just 3 percent in March. Put more simply, usage of 64bit Windows Vista is growing much more rapidly than 32bit.

This rapid growth may appear to have come from nowhere, but on closer inspection, it hasn't. The major advantages of running a 64-bit installation of Vista are the ability to use 4GB or more of RAM and to run 64-bit applications. Although 32-bit operating systems can address more than 4GB of memory, for compatibility reasons Microsoft limits 32-bit desktop versions of Windows to 4GB. Prices for RAM have fallen, however, and OEMs are offering PCs with 4GB of RAM or more, forcing the switch to 64-bit Windows. The realization that Vista x64 has significantly better compatibility than XP x64 is also starting to sink in. In Vista, 64-bit drivers are required for WHQL certification, and so many hardware manufacturers that were previously ignoring x64 have finally started to release x64 drivers.

Microsoft is also playing its part in the move to 64-bit. In addition to its 64-bit ready webpage, the company recently launched the Windows Vista Compatibility Center into beta phase, which will tell users whether a given product is 64-bit-compatible or not. x64 is clearly the future. If Redmond does indeed follow through with its decision to offer x86 and x64 versions of Windows 7 (and there's no indication that it won't), the company should at least make the effort to get OEMs to offer x64 by default on systems that can run it.

Further reading: Windows Vista Team Blog: Windows Vista 64-bit Today

Fuel cell improvements raise hopes for clean, cheap energy

With pressure mounting to transition away from fossil fuels, fuel cell research has grown significantly in the last several years. In the simplest sense, a fuel cell is a battery that you refuel: a chemical reaction is regulated and its energy harvested in the form of usable electrical current. Current solutions use exotic materials to regulate the reactions and often require fossil fuels to generate the chemicals, defeating the purpose of the exercise. Today's release from Science includes three articles that detail methods that may help us overcome the problems with current-generation fuel cells.

Cheap catalyst splits water

Widespread use of fuel cells will rely on cheap sources of hydrogen and oxygen. Researchers at MIT have now made an oxygen-producing catalyst that operates on water in a neutral environment (pH 7 at atmospheric pressure) and can be coupled with solar cells; it's essentially a man-made equivalent to photosynthesis.

Platinum has been used as a catalyst for this reaction in the past, but the costs associated with platinum (it closed today at over $1,730 per ounce) have prompted efforts to eliminate its use. The new research describes the formation of a catalyst composed of a combination of cobalt, potassium, and phosphorus—all cheap and easy to obtain. The researchers found that two different inert electrodes would, when placed into a dilute solution containing cobalt and buffered with potassium phosphate, spontaneously form a coating of the catalyst. When provided with relatively low electrical potentials, such as those obtained from a solar cell, the catalyst would liberate oxygen gas by splitting the water that was acting as a solvent.
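For reference, the oxygen-evolving half-reaction such a catalyst drives at the anode, with hydrogen produced at the counter electrode, can be written as follows. This is textbook water-splitting electrochemistry rather than a detail taken from the paper itself:

    \mathrm{Anode:}\quad 2\,\mathrm{H_2O} \rightarrow \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^-
    \mathrm{Cathode:}\quad 4\,\mathrm{H^+} + 4\,e^- \rightarrow 2\,\mathrm{H_2}

The electrical potential from the solar cell supplies the energy to push these reactions forward; the catalyst's job is to lower the kinetic barrier at the oxygen-evolving electrode.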

The key breakthroughs here are the elimination of precious metals from the catalyst, the in situ formation of the catalyst, and the benign operating conditions of the reaction. All of this adds up to big cost savings in splitting water into its component gases. Platinum's cost is all too apparent to anybody who has ever been to a jewelry store, but less apparent are the costs associated with producing catalyst materials, a process all but eliminated in this research.

Using less metal

Another use of platinum may go by the wayside in favor of an organic alternative, courtesy of Australian researchers. The metal is often used as a cathode that forms the interface between air and an electrolyte, in both fuel cell and air/metal battery applications. This electrode's job is to reduce oxygen from the air and diffuse it into the electrolyte, where it can be put to work in further chemical reactions that generate electricity. Here, platinum has issues beyond its exorbitant cost: it is deactivated in the presence of carbon monoxide gas, and the platinum particles diffuse through the carbon substrate to form agglomerates that harm performance.

Electrically conductive polymers have been tested, but their performance simply wasn't sufficient to justify replacing platinum. Developments in gas-phase deposition techniques, however, have now allowed higher-quality electrically conductive thin-film polymers to be produced, opening the door for fuel cell applications. In this case, the researchers focused on a conductive polymer called poly(3,4-ethylenedioxythiophene), or PEDOT. The need for a high surface area in contact with the incoming gas, combined with the need to keep moisture out, led the scientists to coat the PEDOT on every hiker's best friend: Gore-Tex fabric. To further enhance conductivity, a 40nm gold coating was added.

The PEDOT electrode is homogeneous, eliminating the catalyst agglomerations that plagued the long-term reliability of platinum-based electrodes. It's also insensitive to carbon monoxide poisoning, another performance-robbing problem with platinum. The optimal PEDOT coating thickness was found to be 400nm, and performance was on par with that of standard platinum-based electrodes; the researchers ran the electrode for 1,500 hours with no loss in performance. With the cost of the platinum in a fuel cell being equal to the total cost of an equivalent gasoline engine, this breakthrough has huge potential to drive down the cost of fuel cells, although the researchers were quick to point out that similar breakthroughs are needed to get rid of platinum on the anode side.

Solid oxides get to chill

Solid oxide fuel cells (SOFCs) represent a completely different approach to the problem. They're one of the leading options because, compared to many other green technologies, they have relatively high efficiency, high energy storage density, and produce only water as a byproduct. While SOFCs have not made substantial inroads into the consumer space, they are being adopted as emergency power systems for hospitals, 911 dispatch centers, and other critical facilities.

The primary limitation of SOFCs is their high operating temperature. SOFCs operate by diffusing O2- ions across a ceramic electrolyte. Current-generation systems use Y2O3-doped ZrO2 (YSZ) electrolytes that require operating temperatures above 700°C, because the diffusion is a thermally activated process. A variety of alternatives to YSZ have been suggested, but they offer only modest improvements in operating temperature. In this week's Science, researchers from Madrid and Oak Ridge National Lab describe a novel SOFC membrane that operates at room temperature.

In these ceramics, solid state diffusion of the oxygen can be thought of as occurring through a series of atomic jumps, where ions leap from one lattice site to the next provided the next site is vacant. The most direct way to increase ionic conduction is to increase the number of vacancies, and raising the temperature is typically the easiest way to do that. This temperature effect gives rise to the high operating temperatures in conventional SOFCs. The materials in this study are unique because they stabilize incredibly high fractions of vacancies at room temperature.
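The temperature dependence being fought here is the standard Arrhenius-type law for thermally activated ionic conduction; this is a textbook relation, not a formula quoted from the paper:

    \sigma(T)\,T = \sigma_0 \exp\!\left(-\frac{E_a}{k_B T}\right)

where \sigma is the ionic conductivity, \sigma_0 a material-dependent prefactor, E_a the activation energy for an ion to hop into a neighboring vacancy, and k_B Boltzmann's constant. Useful conduction at room temperature means either lowering E_a or, as in this work, packing the material with far more vacancies than it would normally sustain.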

Instead of using monolithic YSZ, the authors used thin-film growth techniques (molecular beam epitaxy) to grow alternating layers of YSZ and SrTiO3 (STO), each 5 to 60 nanometers thick. They found that these two materials form an interface where the anions (O2-) become highly disordered, producing an anomalously high number of vacancies. These unique interfaces form a superhighway for O2- conduction.

Electrical measurements showed that the primary conduction pathway in the materials went through the YSZ/STO interface, but the YSZ layers showed some conduction as well. The work conclusively shows that the conductivity is thermally activated and thus the result of ionic motion rather than electron migration. That data is incredibly important, because previous reports of high ionic conductivity ultimately turned out to be the result of electronic conduction through defective membranes, making those materials useless for fuel cells.

Despite the substantial promise of these materials, it is probably premature to start placing orders for your room-temperature SOFC; drawbacks include processing that is not amenable to mass production, fast conduction in only two dimensions, and a lack of long-term stability information. Despite these concerns, this work is likely to represent a major step in the march towards wider SOFC commercialization.

The same general note of caution applies to all of these developments, as it's possible that some of these techniques won't scale, or will only find a home in some specific applications. Still, they highlight how focused research and development can produce significant improvements in clean energy technology.

Nobel Intent writers Todd Morton and Adam Stevenson produced this report.

Sciencexpress, 2008. DOI: 10.1126/science.1162018
Science, 2008. DOI: 10.1126/science.1159267
Science, 2008. DOI: 10.1126/science.1156393

Interview: PA alum discusses A40 audio for competitive gaming

Josh LaTendresse has a long and storied background in the gaming and audio world. He worked at Monster before coauthoring Gaming Hacks from O'Reilly Books, and he wrote a popular column about the world of audio-visual products and wiring for a little web comic called Penny Arcade. Now he's working for ASTRO Gaming, a company that wants to redesign your gaming peripherals.

Its first product is the $250 A40 Audio System, a combination headset and "MixAmp" that allows you to control your gaming volume and voice chat independently. A40s can also be linked together to create a lag-free audio environment for competitive-level play. Also, the headset is comfortable as hell. LaTendresse was nice enough to give us some of his time to talk about his work and the A40.

Okay, tell us your background, and how you got into product design.

I'm an unapologetic techno-junkie gamer, home theater aficionado, and gadget freak who cut his teeth during the heyday of the arcade and personal computer revolution. My first computer used floppy disks that were actually, well… floppy, and my first gaming console experience involved wood grain. If you are reading this and nodding your head right now, then you know just how few useful products have been created specifically for our subculture outside of a pure, pragmatic hardware standpoint. Where do you put all of your expensive technology? How do you interface with it? Where do you store, organize, and display it?

These are the questions I asked myself a while ago, and being the pragmatic, handy guy that I am, I started designing and building a few things that made my life easier and better. But this only opened a big can of worms and led to more and more ideas which I knew were way beyond the small-scale thinking that I'd done thus far. It was around this time that I got connected with ASTRO Studios—which was just then finishing up the design of the Microsoft Xbox 360. I had a meeting with the 360 designers and the CEO of ASTRO, Brett Lovelady—and literally gushed out about 20 ideas for things I thought that gamers couldn't live without. A couple of these ideas became the seeds for a company that we started called ASTRO Gaming.

Explain how you became involved with Penny Arcade. Did that connection help you get people interested in your more recent exploits? You also used to work for Monster Cable, correct?

My time at Monster Cable and with Penny Arcade is actually pretty closely related. I'd been a fan of Penny Arcade since nearly the very beginning, and one day I came into my job at Monster Cable and saw this on the PA site: http://www.penny-arcade.com/comic/2002/11/25/. The Monster IT guys thought we were suffering a DDoS attack—and I had to explain to them that we were only being wanged. I dropped a note to Mike "Gabe" Krahulik explaining that I was a fan and loved the lampoon of Monster. He quickly called me back to say that since they'd put up the comic he was buried in mail from gamers who really didn't understand how to best wire and set up their home theater systems, and wondered if I could help him out.

Thus, my column called "Hook Up" was born, and I ended up writing quite a few articles for PA over the next few years (and I think a couple of people actually read them). This was really the seminal shift in my career from consumer electronics into gaming. I began with writing for PA, and that eventually led to helping with Gaming Hacks published by O'Reilly, to being a full-time pseudo-journalist for GamesRadar.com and PC Gamer.

I think that if there is something interesting about my career, it is that I came in through the very, very, back, back door. If you think that you want a career in gaming, get your education and then find a toehold wherever you can. After that: network, network, network. Every single person you meet is important, no matter how junior they may seem today. Get to know every secretary, intern, and staffer that you can—in five or 10 years, these will be the managers, producers, and directors at companies you'll want to work for or partner with.

Walk us through what the A40 Audio System is, and what you're hoping it's going to do for gaming. Who is your audience?

We've designed the A40 Audio System to be a solution for both professional gamers and hardcore enthusiasts. In a nutshell, the A40 Audio System combines Dolby Digital/Dolby Headphone surround technology with whatever voice communication system you prefer—be it Xbox Live, Skype, or Ventrilo-based VoIP. Furthermore, two or more A40 MixAmps can be daisy-chained to activate the embedded, high-quality, zero-lag voice communication network. And you'll just have to hear it to believe how good the quality is.

What we hoped to do for gaming is really already happening. We've been working with the premier gaming league, the Major League Gaming Pro Circuit, for over two years in order to directly address the needs of the world's most demanding gamers while developing our product. Our prototypes worked so well last year that they were actually banned after a big win; that ban has been lifted now that the retail product has been released for all teams to freely use.

But we are already back on top—just this past tournament at MLG Orlando we watched (and cheered!) as every single first-place team used the A40 Audio System. So we really feel our products give players an advantage that is much more than hype, and enable players to communicate freely and play their best game possible.

For someone who once sneered at people who describe their TV size in inches, using the A40 on your gaming consoles requires many cables, and won't work if you're using an HDMI connection. Have you thought of making a headset-only wireless version?

I see that you were one of the 17 people that read my old column! It's nice to see that we both have less time on our hands these days.

Seriously though, the A40 Audio System was designed to fit the needs of the very demanding tournament environment, and today this means 'wired.' Tomorrow's technology may enable us to cut the cord, but we'd need to have hundreds, if not thousands, of headsets operating in a very confined space and playing nice with each other—and that technology hasn't yet reached the consumer market.

Most devices that attach to an HDMI connector also have an output for digital audio after the video is stripped out. This is where you should be attaching your digital audio cable. Connecting digitally is a real benefit for the A40 Audio System—the noise floor just drops to zero compared to most of the junky analog connections you see on most devices today.

We spoke at PAX last year, and you seemed like someone who takes design and quality very seriously with your personal rig. So dish: what peripheral or piece of hardware lately has made you gag a little? What's the last peripheral for gaming you've bought where you were impressed?

I'm a little obsessed, and I'll be the first to admit my personal space is a bit out of control. But ever since ASTRO became involved with the design of the HP Blackbird (ASTRO Studios was the design consultancy on the Blackbird project), it's been really hard to walk into the PC section of Fry's and not feel physically ill at the sight of the utterly horrible state of the PC tower industry. Since I started building my own PCs, I've always gone the rack-mounted server route for my rigs, rather than wade into a cheap plastic and tastelessly lighted children's computer. ASTRO designed all three generations of the Alienware PC towers and laptops, so we are equally guilty—and obviously they are not my cup of tea, either.

But the HP Blackbird went in a completely different direction and is absurdly well-designed and built. Not only in form, but in specific function—slot loading drives, HDD docking bay, simple access and tool-less upgradeability, in addition to an extra dimension in cooling due to its lifted chassis. And what a chassis!

Do you ever wonder why we don't see more high-quality peripherals for gaming on the consoles? It seems like there are things like the official controllers, and then cheap knock-offs. Things are slightly better for the PC, but the A40 is one of the more notable pieces of high-end equipment dedicated to gaming I've seen in a while.

Many of us have grown up with our Atari, Nintendo, Coleco, Sega, and Sony game systems that were—rightfully at the time—very oriented towards the children's toy marketplace. But as we've grown up, much of the hardware, and nearly all of the accessories created for gamers, hasn't. On the flip side, there are many products that are oriented toward the older generation of gamers that come from "hopelessly corporate," office-productivity companies. Despite being higher quality, their products are extremely conservative, and don't embrace the fun irreverence and true identity of the gamer subculture—not to mention the specific features and functionality that fit our needs.

So now that the A40 is shipping, what's next on your plate?

ASTRO Gaming will stand astride the lack of products that I've previously mentioned. I can't go into detail about what we have in store next, but rest assured, we aren't a "Headset company," or even an audio-centric one. ASTRO is a high-end gaming equipment and accessories company.

As I'm sure your readers will agree: gaming is cool, incredibly social, and an unmistakably vital component of today's digital lifestyle—and it's about time we had access to products that reflected that.

Canada seeks industry, not consumer input on secret treaty

ACTA, the Anti-Counterfeiting Trade Agreement being negotiated by wealthy nations, continues to make headlines due to the secrecy surrounding its drafting. Despite the fact that the agreement may include provisions like ISP filtering that are likely to affect huge numbers of people, no draft of the treaty is available. What's worse, there appears to be a worrying trend among governments to consult early and often with copyright holders and only later let the public in on the action.

That appears to be especially true in Canada, where law professor Michael Geist found that the government had put together a group of "insiders" to advise on the treaty. Included, of course, were representatives of the recording, video game, and movie businesses; not included were privacy representatives, NGOs, or digital society groups.

Geist revealed the existence of the group in a Toronto Star column this week, based on documents he received under Canada's "Access to Information Act" (similar to the US Freedom of Information Act).

According to his information, the "Intellectual Property and Trade Advisory Group" was planned to include 12 government departments and 14 industry groups, and all would be a part of "in-depth exchanges on technical negotiating issues." In other words, they would be brought in to hash out the nitty-gritty of the treaty, which means they would have direct input into its formation and access to the negotiators.

It makes sense to invite affected parties in to consult on legislation; they certainly know their business better than the government and are in the best position to understand the effects of legislation. But that means all affected parties. At least the US, while still releasing few details about the fast-tracked negotiations, has solicited public comment and has made those comments publicly available. That's how we know, for instance, that the MPAA favors jamming some kind of "three strikes and you're off the Internet" law into the agreement, while the RIAA wants to criminalize even noncommercial piracy.

ACTA negotiations are ongoing, in fact, with an important three-day Washington meeting wrapping up today. IP Justice, an NGO that deals with such issues, obtained a leaked memo (PDF) that "concerned business groups" submitted to "ACTA negotiators." It's nothing particularly sinister, and doesn't appear concerned with the Internet issues that have so interested some copyright holders, but it shows just how organized the business lobby is. (A previously leaked RIAA memo showed that Big Content has been leaning on negotiators for months, too.)

Such important changes to public policy need robust public debate; let's hope Canadians, Americans, Europeans, and everyone else gets some before the treaty is signed.

OSS voices must be heard in national security debate

At the OSCON open-source software convention last week, the Foresight Institute's Christine Peterson—the individual credited with coining the term "open source"—urged technology enthusiasts to help redefine the way that society responds to security threats. The stakes are high, she claims, and the cost of failing to act could be enormous.

She began her presentation by discussing the multitude of serious problems that have emerged from the adoption of electronic voting machines in the United States. Although electronic voting was originally devised to simplify elections and increase the accuracy of ballot tabulation, the voting machines in use today are disastrously unreliable and insecure. The hardware failures and demonstrable susceptibility to tampering exhibited by these devices are undermining the transparency and credibility of American democracy.

Many technologists—including those of us here at Ars—anticipated these problems, but were unable to elevate the issue into mainstream awareness before disaster struck. The flood of research, security analysis, and pragmatic criticism came too late to stop widespread adoption of deeply flawed election technology. This mistake, Peterson argued, is not one that we can afford to repeat. In the face of defective technologies that threaten to undermine basic liberties, the open source software community "should rise up like an allergic reaction," she said.

Resistance, however, is not enough. In order to overcome such challenges, technology enthusiasts must find better ways to address the underlying problems that seemingly necessitate the faulty solutions. According to Peterson, the area where there is the greatest need for action is national security. The federal government's controversial use of secret surveillance raises serious questions and poses a very real threat to privacy. She believes that the government has adopted this risky top-down approach to security because it lacks the tools it needs to address the problem in a more responsible way.

Peterson acknowledges that the legitimacy of the threat posed by terrorism today is widely disputed, but she contends that very real and infinitely more deadly security threats could await us in the future. The rapid advancement of technology opens the door for a whole new class of weapons of mass destruction. She warns that the terrorists of the future might employ bioweapons, nanotechnology, and chemical warfare to bring about death and destruction on an unprecedented scale. "The threat is real, to some extent. If it's not real today, it will be real someday," she stated.

The government, with its extreme dependence on ineffectual top-down surveillance, will be ill-equipped to address these new threats. The result could be an even greater loss of privacy as surveillance becomes more intrusive. Peterson also noted the inevitably high price we pay when surveillance fails and inaccurate intelligence is used to direct policy. One example she cites is an incident from 1998, when the United States mistakenly bombed a pharmaceutical plant in Sudan that was making medicine, because faulty intelligence indicated it was producing chemical weapons.

"They don't have a big tool set—they want surveillance. Here in our community, we debate these things. [The government] doesn't give a damn about our debates, they just go ahead with their plans," she said. "Not only does [the top-down approach] lead to a surveillance state, but it doesn't work."

Instead of using secret spying, "we need to track the problem, not the people." The best way to combat the problem is to redefine the solution space. The answer is to drive innovation and deliver new technologies that can guarantee both privacy and security. Tools must be built that can detect security threats while also imposing verifiable limitations on government intrusion. In order to prevent abuse, these tools must be utterly transparent and perpetually subjected to the highest level of public scrutiny. Her mantra is "no secret software for public sensing data."

The people who will build such tools, she insists, need to have a deep understanding of security, privacy, functionality, and freedom. She is completely convinced that the open-source software community has the values and expertise needed to lead the way.

She hopes to take the first steps towards achieving these goals at the Foresight Vision Weekend, an informal conference that will take place in Silicon Valley in November.

Hands on: Delicious 2 cleans up social bookmarking

Popular social bookmarking service del.icio.us has finally unveiled one of its most significant and anticipated redesigns since launching in 2003. Ars Technica went hands on to see what Yahoo has in store for the next generation of social bookmarking.

As a quick primer for those who haven't hopped on the bandwagon yet: social bookmarking websites allow you to save URLs in "the cloud" instead of in a single browser. Bookmarks can be tagged with multiple keywords for easy categorization and recall, which offers a number of benefits. First, sites like these are great resources for watching what other human beings (not automated search engines) are interested in. Second, the many integrated tools for saving and retrieving bookmarks act as social filters and liberate your web browsing habits, since your bookmarks are no longer locked up in a single browser or even a single computer.

One of the most subtle, yet important, changes in the new Delicious (called "delicious 2" by everyone but its developers) is the loss of two periods from its name. The URL used for the service until now has always been a clever play on domain names: del.icio.us. Now that URL, and all links to bookmarks saved at it, redirect to the much friendlier delicious.com.

As explained in an announcement blog post, the new Delicious focuses on three fundamentals: speed, search, and design. Browsing through the site, that first criterion has clearly been met. Yahoo says Delicious now has over 5 million users, and as a regular user of the site for around three years, I've noticed the gradual slowdown that Delicious acknowledges. That creeping sluggishness is nowhere to be found anymore, though, and clicking through tags from both my own bookmarks and across the site's community is really zippy.

Next on Delicious' menu is the all-important search, which again focuses on speed, but also on utility. I no longer have time to make a pot of coffee while waiting for results, and Delicious' search is both more accurate and more social. A search can be directed at one's own tags, a single user's public bookmarks, the bookmarks from one's social network, or, of course, the entirety of Delicious. In every case, searches are lightning quick, though I'm disappointed to see the default search option point at the entirety of Delicious instead of my own bookmarks. Perhaps the Delicious crew has real user statistics to prove that this is the better choice, but I prefer rooting through my own maze of bookmarks and tags before embarking out through the rest of Delicious. At the least, I would like a feature to customize this default setting.

As a brief side note, I'm very glad to see that the Delicious crew maintained composure when it comes to the social networking aspects of the site. The ability to add friends and send links to other users is definitely a value of the service. But in the back of my mind I always worried that this new version would bring a lot of ridiculous cruft like extensive user profiles or, heck, even minigames like "Link that beer" or "URL Scrabble." It's refreshing to see that the site maintained just the bare necessities of a "network" list of friends, as well as "fans" who are interested in what you're bookmarking.

Easily the most significant change to the new Delicious is the team's third criteria: design. Yahoo says that improving usability and adding a handful of frequently requested features were top priorities, and it shows. Delicious' UI is much cleaner now, with bookmark metadata like dates and tags getting cleaned up with more relevant placement and visual markup. A new colorized counter accompanies each bookmark to let the user know how many others have saved the same URL, and the light blue color gets darker and heavier as that number increases.

A handful of new tools have been added to the site to make it easier to navigate bookmarks and home in on a specific tag. In the top left of each bookmark list, for example, are three view buttons that allow for adjusting how much metadata is displayed with each bookmark (tags, URL, etc.). Bookmarks can now be sorted alphabetically in addition to chronologically, and a "Top 10 Tags" widget in the right sidebar of each user page offers a quick glimpse at what kinds of bookmarks make him or her tick. Lots of smaller bits of polish sprinkled throughout make the new design a joy to explore. The Delicious team has also put together a clever video that does a great job of highlighting the feature evolution this update brings.

With all of these welcome changes, though, Delicious still suffers from some rough edges and bizarre stubbornness. One previously existing feature that allows for automatic blogging of the day's bookmarks, for example, is still listed as "experimental" and is fairly clunky to set up.

On a grander scale, Delicious is sticking with its single-word philosophy for tagging bookmarks on the service, instead of adopting the far more useful comma-separated method that most other sites have agreed on. This means that if you add 'Mac OS X' as a tag to a bookmark, you've actually added three tags: "Mac," "OS," and "X." This needlessly restrictive limitation can make the process of building a tag hierarchy frustrating, especially for new users.
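A quick illustration of the difference between the two tagging conventions (a hypothetical snippet, not Delicious' actual code): splitting on whitespace breaks a phrase into separate tags, while comma-separated input preserves multi-word tags.

    # Space-delimited tags (Delicious-style) vs. comma-delimited tags.
    # Hypothetical illustration, not Delicious' implementation.
    raw = "Mac OS X"
    print(raw.split())                                   # ['Mac', 'OS', 'X']

    comma_raw = "Mac OS X, bookmarking"
    print([t.strip() for t in comma_raw.split(",")])     # ['Mac OS X', 'bookmarking']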

That said, the new Delicious is largely a success. The new design is far easier to navigate and provides a lot of useful ways to visualize bookmarks from one's own collection and across the site. The speed brings a refreshing boost to performance for those who frequent the site, and some limited testing shows that existing clients for posting and retrieving bookmarks still work perfectly well. The new Delicious may not bring a revolutionary change to social bookmarking, but the significant changes are a very welcome evolution.