Report: Google plans venture capital group, but why?

Due to its success in the online ad and search market, Google has amassed nearly $13 billion in cash. So far, this hoard has mostly been used for the purchase of startup companies that are eventually assimilated into the Google Borg. But the search giant has also set up a foundation, Google.org, that (among other activities) invests in companies that are pursuing goals that the Google founders deem worthwhile. Apparently, the company has liked the investment approach enough that it’s now considering creating an in-house venture capital effort.

The Wall Street Journal is reporting that planning for a venture capital group is already under way. The group will apparently be run by Google’s chief legal officer, David Drummond, who presumably has time to kill when he’s not testifying before Congress. The Journal reports that a former entrepreneur and private investor named William Maris will be brought on board for his expertise.

It’s not entirely clear what Google hopes to accomplish through an investment arm. Although the economy is a major worry at the moment, there has been little indication that this has had a damaging impact on the venture capital markets. As such, it’s not clear that there is a desperate need for Google’s billions to float new ventures. The flip side, of course, is that the investments might pay off for Google, but the company doesn’t seem to need help in that regard, either.

One alternative explanation is that Google seeks to influence the development of new markets and technologies through its choice of investments. It’s possible that the company will structure the investments so that it has the option of buying a startup outright should things work out.

This approach would entail a number of risks, however. Google is already being accused by its competitors of having a monopoly on search, and investments that seek to influence the development of this market would inevitably end up presented as evidence of monopolistic abuses. Meanwhile, startups might be leery of accepting investments from Google if they felt there were strings attached. Those behind new ventures will want to know that they can pursue anything that makes sense financially and technologically, rather than feeling restricted to the markets Google thinks are appropriate or obliged to work with an eventual buyout by the search giant in mind.

The Journal notes that Google would join a significant list of technology companies, including Intel and Motorola, should it open an in-house investment group. It describes the experience of these other companies as mixed, and notes that their investments account for a shrinking slice of the venture capital pie. All of this makes the decision by Google that much more puzzling, as the company tends to avoid entering shrinking markets.

Street Fighter vs. Mortal Kombat: still trash-talking

Since the early 1990s, the Mortal Kombat and Street Fighter franchises have maintained a strong rivalry. It all began back when Kombat creator Ed Boon claimed that his game would "kick [Street Fighter]'s ass," a statement he continues to make to this day. Capcom, naturally, never took kindly to such words, and accused the gorier fighting game of simply imitating the greatness of its Street Fighter franchise.

At last week's Comic Con, Eurogamer discovered that the rivalry is still going, and it seems to be just as heated as ever now that both Mortal Kombat vs. DC Universe and Street Fighter IV are set to hit stores in the near future. When Johnny Minkley tracked down Ed Boon at the show, he asked if the developer still felt like he was competing with the Street Fighter games. "I think just from the history, we do," Boon said. "I think we'll kick their ass… we're a wilder ride—a big rollercoaster ride—and they're a little bit tamer."

Meanwhile, at Capcom's booth, community manager Seth Killian was just as impassioned. "You can't touch the mechanics of Street Fighter," he said, "and [SFIV] is really channeling back the classic mechanics that ignited the world… Mortal Kombat was riding the coattails of Street Fighter then, and I think Mortal Kombat may be riding the tails of Street Fighter as we move into 2009."

At this point, the rivalry is starting to seem a bit absurd, as the two games have evolved into largely different animals. The core Street Fighter games have always felt faster-paced and maintained an anime feel thanks to their art style and 2D layout. The Mortal Kombat games, on the other hand, feel a little slower in terms of combat but boast showier moves and have often featured weapon mechanics, 3D environments, and strangely enthralling amounts of gore.

Maybe it's time to leave this competition behind and find something else to focus on.

Opt-in or opt-out? Street View case echoes privacy debate

The UK today cleared Google's Street View service for use after looking into the privacy implications of the program, but the panoramic pics of real-world streets and homes continue to generate controversy. Google's response to critics? Get over yourselves.

A Pittsburgh couple, Aaron and Christine Boring, sued Google earlier this year in federal court after their $163,000 home appeared on Street View; the shot of the home was allegedly taken from the couple's private road. How do we know the home cost $163,000? Because photos of the property, details of the sale, and even a rough floorplan are already publicly available on the Internet from the Allegheny County assessor (not linked to preserve whatever privacy the Borings have left).

Google's lawyers, never ones to miss a trick, have already pointed this out to the court. The couple claims injury "even though similar photos of their home were already publicly available on the Internet, and even though they drew exponentially greater attention to the images in question by filing and publicizing this suit while choosing not to remove the images of their property from the Street View service," says Google.

Privacy? What privacy?

This statement is made in service of Google's larger point, which is: no one today has complete privacy. Except, perhaps, hermits.

"Today's satellite-image technology means that even in today's desert," Google writes, "complete privacy does not exist. In any event, Plaintiffs live far from the desert and are far from hermits… In today's society people drive on our driveways and approach our homes for all sorts of reasons—to make deliveries, to sell merchandise and services door-to-door, to turn around. As a society, we accept these 'intrusions'. They are customary, even expected." (The Smoking Gun unearthed the filing yesterday, though there's nothing "smoking" about it; anyone with PACER access to federal court cases can search for case 2:08-cv-00694-ARH.)

The pictures in question were "unremarkable photos" and the alleged trespass was "trivial." At every point in its response, Google passes off the complaints over Street View as simply too petty to bother with—even as it admits that its driver may, in fact, have driven past a sign marked "private road" to take the photo.

You complain, we fix

The company's point about complete privacy is well taken; we don't have it. But Google's preferred solution to the problem is for people to use "the simple removal option Google affords." Sound familiar? Sure it does, because it's the exact same argument the company makes to rightsholders: tell us where the problem is, we'll fix it, but don't ask us to be proactive about clearing permissions first.

This dispute is at the heart of the Viacom/YouTube $1 billion lawsuit, among other cases. In this instance, Google basically says that it's up to people to scan Street View themselves, pick out photos that might be private, then notify the company. Staying off of private roads isn't Google's problem; it's the homeowner's. That might sound burdensome, but it's the same argument deployed against rightsholders over video.

This tension between the opt-in/get-permission/check-first model and the opt-out/seek-forgiveness/fix-later approach is shaping up as a fundamental point of contention on the Internet. NebuAd's opt-out approach to grabbing ISP clickstream data has become such a big deal that Congress has already held multiple hearings on the matter and has ISPs across the country running scared. When it comes to copyright, rightsholders have pushed (with some success) for video-sharing sites to screen uploaded content for possible violations before it goes live. User-generated content sites, which have powered the Web 2.0 revolution, are under attack over uploads of child pornography, regular pornography, and clips of public harassment and abuse of others. And a UK government commission this week recommended that user-generated content sites be forced to screen all uploads with human eyes before pushing them out to the web.

Copyright, privacy, and school bullying videos might not seem to have much in common, but the debate over screening first vs. fixing later could reshape the Internet as we know it. Having to get permission or screen content would hobble useful services like Street View and YouTube, and it would probably put companies like NebuAd out of business, even as it might lead to less objectionable or private content online.

But Wikipedia, YouTube, Flickr, and other services have shown us that the great mass of Internet users can produce a volume of content that boggles the mind and overwhelms attempts at centralized corporate screening and control.

Which approach do we want, and for which services do we want it? The Boring case is one tiny piece of this much larger debate, a debate which is about as interesting—and important—as Internet debates can be.

Superluminal waves make a theoretical splash

One of the fundamental conclusions of special relativity is that information cannot travel faster than the speed of light. This, of course, means that physical objects cannot travel faster than the speed of light either. However, every few years, someone reports an experiment in a paper that contains the phrase "superluminal velocity." What is up with these claims? Is there a chunk of the physics community out there hiding something from the rest of us?

Sadly, the answer is far more mundane than that. What the researchers are usually reporting is called the group velocity, which is allowed to travel faster than light because it carries no information. However, this raises the question of why researchers are still interested in something so apparently mundane. The answer is quantum mechanics.
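For reference, the textbook definitions (standard results, not anything specific to the paper discussed below): for a wave with angular frequency ω and wave number k,

\[ v_p = \frac{\omega}{k}, \qquad v_g = \frac{d\omega}{dk}. \]

The phase velocity describes the underlying oscillations, while the group velocity describes how fast the peak of a pulse envelope moves. In a strongly dispersive medium, the group velocity can exceed c (or even turn negative) without any signal front outrunning light; the front velocity, which is what relativity actually constrains, stays at c.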

In quantum mechanics, there is a phenomenon called tunneling that describes how a particle, trapped by a barrier, can escape through that barrier even though it doesn't have sufficient energy to do so. Tunneling is of fundamental importance because it provides a framework through which our understanding of radioactive decay and modern electronics (among other fields) is derived. On close inspection, however, tunneling raises a few questions as well; for instance, what is the speed of the particle as it passes through the barrier, and how long does the tunneling process take?

The theoretical picture tells us that the speed of a particle in the barrier is purely imaginary (as in imaginary numbers, not pink elephants), as is the transit time, which makes interpretations problematic. However, by treating the particle as a wave and examining the phase change induced by the barrier, a real transit time can be derived. This transit time becomes independent of the barrier thickness for thick barriers, which has some interesting consequences. For a sufficiently thick barrier, the particle must move faster than the speed of light. Unfortunately, measurements of this phenomenon have proven to be quite difficult, and no conclusions have been drawn from the initial results.
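To make this concrete, here is the standard textbook illustration for a simple rectangular barrier (the coaxial barrier discussed below is smoother, so treat this as a sketch of the idea rather than the paper's own calculation). For a particle of mass m and energy E hitting a barrier of height V_0 > E and width d, the wave number inside the barrier is imaginary, with magnitude κ, and the phase (Wigner) transit time saturates once the barrier is opaque:

\[ k = \frac{\sqrt{2mE}}{\hbar}, \qquad \kappa = \frac{\sqrt{2m(V_0 - E)}}{\hbar}, \qquad \tau_\phi \to \frac{2m}{\hbar k \kappa} \quad (\kappa d \gg 1). \]

Because the transit time stops growing with d, the apparent speed d/τ_φ increases without bound as the barrier gets thicker, eventually exceeding c; this saturation is known as the Hartman effect.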

Luckily, there is an alternative way to test this: light. Under the right circumstances, light will also tunnel across a barrier—a phenomenon with the catchy name of frustrated total internal reflection is one example—and an analogous calculation for light can be made. But light causes its own set of experimental problems. All of the potential experiments involve detecting very weak pulses of light in the presence of much stronger reference signals. There is also the issue that the weak pulse is derived from the stronger pulse—as the intensity of a pulse gets weaker, it is quite likely that the pulse width will get shorter. Depending on how you define your measurement protocol, you might measure faster or slower transit times.

Clearly, what is needed is a way to create a tunneling barrier through which all the photons will tunnel. This is exactly what a group of Russian and Swedish researchers have proposed. In a recent Physical Review E publication, they have described the properties of a coaxial transmission line where a chunk in the center is modified to create a tunneling barrier for microwave light.

This is achieved by gradually varying the composition of the plastic center so that the light "sees" a weak "U" shaped potential. Under the right conditions, none of the microwaves will be reflected by this barrier, but the properties of the barrier and the mathematics describing it tell us that the light must tunnel. Furthermore, the nature of a coaxial cable means that every color of light travels at the same speed—unlike fiber optic cables. These properties provide nearly ideal conditions for an experiment.

To make the measurements, the researchers propose taking a microwave source and sending half the radiation through the tunneling barrier and half along a normal coaxial cable. The radiation can then be recombined so that an interference pattern is detected. By shooting pulses of light down the cables, the amount of overlap between two equally intense pulses can be measured, which can be used to derive the time difference between the two paths. If the tunneled pulse arrives first, it traveled at a superluminal velocity.
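As a rough illustration of how a time difference could be pulled out of the overlap between the two arms, here is a toy numerical sketch. All of the pulse widths, delays, and the cross-correlation analysis below are assumptions for illustration only; the proposed experiment detects an interference pattern rather than digitized envelopes.

```python
import numpy as np

# Toy sketch: two identical Gaussian pulse envelopes, one delayed by a
# (hypothetical) extra transit time on the barrier arm, are compared, and
# the lag that maximizes their overlap is taken as the path time difference.
# None of these numbers come from the paper; they are illustrative only.

t = np.linspace(-5e-9, 5e-9, 20001)       # time axis in seconds
dt = t[1] - t[0]
width = 0.5e-9                            # assumed pulse width
true_delay = 0.12e-9                      # assumed arrival-time difference

reference = np.exp(-t**2 / (2 * width**2))                 # plain-cable arm
tunneled = np.exp(-(t - true_delay)**2 / (2 * width**2))   # barrier arm

# Cross-correlate the two envelopes; the lag of the peak recovers the delay.
corr = np.correlate(tunneled, reference, mode="full")
lags = (np.arange(corr.size) - (reference.size - 1)) * dt
estimated_delay = lags[np.argmax(corr)]

print(f"true delay: {true_delay:.2e} s, estimated: {estimated_delay:.2e} s")
```

In this sketch a negative estimated delay would mean the barrier arm arrived early, which is the superluminal signature the authors are after.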

So that all seems pretty simple, right? Where are the experimental results? Well, there aren't any. Being a naturally suspicious type of person, I wonder if this is because they have tried and found they couldn't get the coaxial cable to perform as expected. On the other hand, I would be delighted by robust experimental results on superluminal group velocities (or the lack thereof), and I hope they are planning a follow-up paper. However, what I really want to see is the phase velocity measured. The group velocity is measured by how fast a pulse moves, while the phase velocity is the velocity of the underlying waves that make up the pulse. Measuring the phase velocity across such a tunneling barrier would be very challenging, but would present a real test of the correctness of special relativity. A test I would expect it to pass with flying colors, just as it has with all the other tests.

Physical Review E, 2008. DOI: 10.1103/PhysRevE.78.016601

Fixing the structural ills of US biomedical research

Although the US biomedical research endeavor might appear to be in great shape, it faces some significant problems. The rapid doubling of the NIH budget that started during the Clinton administration was followed by several years of flat funding, and this has exposed structural cracks in the way that we're training and employing researchers. Here at Ars we've been banging that drum for a while now, and today in Science, Michael Teitelbaum, vice president of the Sloan Foundation, joins in with a policy forum article on the topic that contributes a list of suggestions for policymakers.

Starting in 1998, the NIH underwent a doubling of its budget over five years. At the time, this was greeted very favorably. The number of successful grant applications rose, more PhDs were awarded, and more foreign scientists were attracted to the US. Along with this, a lot of money was spent by universities and research institutes across the country on shiny new buildings. Things were looking good.

However, since 2003, budgetary pressures have meant that, in real terms, the NIH budget has actually been falling. Since this is the major funding source for biomedical R&D in the US, a lot of researchers who were accustomed to steady growth suddenly found that funding opportunities weren't keeping pace with the growing research community. After spending five to eight years being trained to be independent scientists, all those new PhDs and postdocs suddenly discovered that only one in five would ever be able to land a faculty appointment. Those recently promoted to the ranks of faculty discovered that instead of being mentored by their senior colleagues, they were instead competing with, and often losing to, those same senior scientists for grants.

One of the problems with the system is that those young scientists, the PhD students and postdocs, do the bulk of the work; it's a lot cheaper to employ a PhD graduate as a postdoc (whom you often don't need to offer benefits to) than it is to hire a technician. However, postdocs are supposed to be temporary positions; after several years, that young scientist should be able to apply for independent funding.

That's the theory. In practice, advances in medical research (ironically) and the concomitant advances in working lifespan mean that the existing tenured faculty across the country are not retiring, so there are far fewer positions than scientists looking to fill them. Furthermore, since these postdocs are working for a PI upon whom they depend for their salary, the expectation is that they will work on the PI's project; the training towards independence isn't happening.

A lot of those postdocs—well, more than half—are foreign-born and mainly working on temporary visas, and those on J-1 visas can often be paid far below the prevailing wage; many universities have stories of unscrupulous faculty who bring researchers over from China or India to work 15-hour days in the lab with the knowledge that those who complain can simply be sent back and replaced.

After describing this situation, Dr. Teitelbaum makes a number of recommendations in the article. One is to better align the PhD/postdoc system with demand in the labor market for graduates; if only one in five is ever going to get a tenure-track faculty position, we need to do a better job of preparing the remaining 80 percent for alternative careers. More funding for positions such as staff scientists could limit the constant stream of young scientists who end up forgotten after three or four years. A rethinking of the number of foreign scientists who are awarded temporary visas each year is also advocated.

Other suggestions include an examination of the way the NIH budget is constructed and how it is spent. The US should avoid rapid growth followed by fallow years; measured, less extreme growth is sustainable and avoids situations such as the one we currently find ourselves in, where a glut of young scientists has nowhere to go. Also mentioned is the idea of limiting the percentage of faculty salary that can be paid by research grants, and adjusting overhead rules regarding funding new facilities and the debt created by building them.

As someone who's been involved in the debate over many of these issues, I find the suggestions all pretty smart, although there appears to be institutional resistance to change and inertia toward anything beyond small-scale fixes. The Pathway to Independence awards are a good example of this; designed to help transition postdocs to faculty, they are a good solution, but with only 250 given out each year and about 90,000 postdocs, they're mere window dressing, and the researchers who manage to win them were always going to get their own funding anyway.

While the NIH's role in creating or allaying these structural problems is often discussed, the role of universities also needs addressing. These institutions are the primary beneficiaries; they get a lot of the grant money through overhead costs, they don't have to pay much in the way of salaries, and they benefit from the publications. Universities are some of the first to lobby Congress to increase NIH funding, but don't appear to be taking their share of the responsibility for the problems.

For example, the universities could consider covering the salaries of their senior faculty. This would free up those experienced researchers from the need to continually compete with (and out-compete) up-and-coming scientists, and instead let them effectively mentor and train the next generations. Of course, as long as scientists continue to use grants as a measure of success against their peers, I'd expect a lot of resistance to this idea. One thing's for sure, though: things ought not to continue as they are now.

Science, 2008. DOI: 10.1126/science.1160272