This is the first blog post I've seen about the kick-off; the author seems to have left the event with a pretty pessimistic view of the prospects for industrial/academic collaboration. His main complaint is that academics don't understand games and the specific needs of game developers. Well, then tell us! I would love to hear about specific problems in game development where evolution or some other form of machine learning or computational intelligence could matter.
Alex J. Champandard, in a comment on the same blog post, develops the point further. He asks:
So why do you need government funding for [applied games research]? It's a bit like admitting failure :-)
On the other hand, if [academics are] doing research for the sake of research, why do they need input from industry?
These questions could be asked of just about any research project at the interface between academia and industry. And yet companies happily keep funding PhD students, postdocs, and even professors in a huge number of research fields, from medicinal chemistry to embedded systems design to bioinformatics. In some cases these collaborations and funding arrangements definitely seem strange, but they apparently make economic sense to the companies involved.
I once asked an oil company executive (at a party! Now, stop bothering me about what sort of parties I go to...) why his company funds a professor of geology. His answer was roughly that it was good to have expert knowledge accessible somewhere close to you, so you know who to ask whenever you need to. Plus, a professor's salary wasn't really that much money in the grand scheme of things.
Now, game companies and oil companies are obviously very different sorts of creatures. I think the main opportunity for game companies would be to outsource some of their more speculative research - things that might not be implementable any time in the near future, either because the computational power is not there yet, or because the technique in question would need to be perfected for a couple of years before deployment. Having a PhD student do this would be much more cost-efficient than assigning a regular employee to it (especially with government funding, but probably also without), and would free up the employee for actual game development. In addition, the company's own developers might very well be too stuck in the way things currently work to try radically new ideas (of course, academics might also be stuck in old ways of thinking, but there are many academics around, and if you offer some funding you can typically select which academics you want working for you).
This argument assumes that game companies do any research at all into technologies that lie more than one release cycle away. I'm not stupid enough to claim that no game companies do this - Nintendo obviously does, for example - but I'd venture to guess that many don't.
As for the other part of Alex's question, "if we do research for the sake of research, why do we need input from industry?", the answer is more obvious. Even if we do research because we love the subject itself and really want to find out, for example, how best to generalize from sparse reinforcements, we also want to work on something that matters! And fancy new algorithms look best together with relevant problems. It's that simple.
1 comment:
Fair point. :-)
I'm curious what you think about the fact that animation research doesn't need some kind of funded coordination network like that.
alexjc
AiGameDev.com