So you want to do some biological experiments, but you don't own a lab. Unlike in many other technology sectors, it is currently not easy to simply exchange money for a wet-lab experiment. This post is my interpretation of the evolution and latest developments in outsourcing wet-lab experiments.
Virtual Biotechs and CROs
Virtual Biotechs have been around for many years, but are becoming increasingly common. Basically, a Virtual Biotech is a biotech that outsources its research, usually to a Contract Research Organization (CRO). Virtual Biotechs are small: usually between one and twenty people, and they may not even have office space. You can tell it's cool to be a Virtual Biotech when you see a trend piece in the WSJ. Here's a description of one Virtual Biotech and a recent success story from the excellent Life Science VC blog.
There are a number of ways a Virtual Biotech can work. For example: a pharma exec sees value in a compound that the pharma has given up on, so he spins it out, raises money to pay for a Phase I or Phase II trial (conducted by a CRO), and if that is successful, he sells it back to the pharma for a profit. Everybody wins.
In other words, Virtual Biotech is often about asset arbitrage rather than research, since developing drugs from scratch is too expensive and slow, and preclinical work does not generate value quickly enough.
Interestingly, huge pharma companies are increasingly acting like Virtual Biotechs, in that they are either using CROs to conduct primary research, or buying up biotechs that already developed a promising compound. Less and less R&D is being done in Big Pharma due to a lack of productivity (see Eroom's Law).
Figuring out which CRO to use still seems to be a trial-and-error process, more like hiring a consultant than buying some compute on Amazon.
Web 2.0 CRO
Academic labs, core labs, CROs and even biotech companies often have excess lab capacity that they want to utilize. About 5-10 years ago, Assay Depot and Science Exchange launched with the intention of connecting that excess capacity with academic and biotech labs that need it. Both act as clearinghouses, connecting you to labs that can perform experiments for you; they provide contact information and billing services, but not much else.
On the positive side, this is often very cost-effective and a great way to leverage the expertise of (for example) the UC Davis Mouse Biology Program. On the downside, the service provider is not necessarily set up to act as a CRO, and you will probably end up having to contract with several labs to get all your experiments done. If a time-critical step, such as preparing RNA for sequencing, is spread across two labs, you may have a problem.
I think the major use-case here is for an academic lab that lacks the ability to do a certain type of experiment. Instead of finessing a collaboration with another lab, you just pay a small amount and get your results back fast.
Here Come the Robots
Starting very recently, there is an exciting new trend in wet-lab outsourcing: robots!
Transcriptic is already up and running, with competitive pricing on cloning, genotyping and biobanking. I believe that Transcriptic will already perform many other types of experiment upon request, and that their advertised experiments are just their foot in the door. Emerald Cloud Lab will be launching in 2015 with a large suite of services. It is currently in beta.
The advantages of doing experiments through a lab like this are tantalizing: (a) it can make experiments cheaper and faster, with economies of scale and machines running all day and night; (b) it can make your research more reproducible, since your protocol will be defined by a machine-readable script; and (c) you can scale up your research from one sample to a huge number, potentially without changing the protocol of your successful pilot experiment.
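To make (b) and (c) concrete, here is a minimal sketch of what a machine-readable protocol could look like: plain Python data serialized to JSON. The operation names and parameters are invented for illustration; this is not Transcriptic's or Emerald's actual protocol format.

```python
import json

def make_pcr_protocol(sample_ids):
    """Build a simple (hypothetical) genotyping-PCR protocol as plain data."""
    return {
        "protocol": "genotyping_pcr_v1",
        "samples": list(sample_ids),
        "steps": [
            # Each step is explicit and versionable, so two labs (or two
            # robots) running this script do exactly the same thing.
            {"op": "dispense", "reagent": "master_mix", "volume_ul": 20},
            {"op": "thermocycle", "cycles": 35,
             "stages": [{"temp_c": 95, "seconds": 30},
                        {"temp_c": 60, "seconds": 30},
                        {"temp_c": 72, "seconds": 60}]},
            {"op": "gel_image"},
        ],
    }

# Scaling from a pilot to a full run changes only the sample list,
# never the steps -- the pilot protocol is reused verbatim.
pilot = make_pcr_protocol(["s1"])
full_run = make_pcr_protocol(f"s{i}" for i in range(1, 97))

# The JSON form is what you'd submit to the robot lab's API (hypothetically).
protocol_json = json.dumps(full_run, indent=2)
```

The point is that the experiment becomes an artifact you can diff, version-control and publish alongside your results, which is exactly what makes the reproducibility argument in (b) plausible.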
Interestingly, Emerald is using the Wolfram Language (very similar to Mathematica). I would prefer Python or something similar, but the Wolfram Language clearly has some great capabilities and seems to be highly expressive for data analysis (see the Wolfram Blog for some great examples).
We're clearly not yet in a world where companies and academic labs can run all their experiments virtually (as web companies now do with AWS, Azure, etc.), and of course many types of experiment necessitate hands-on time and expertise (for example, developing a new protocol or technology).
However, there are also thousands of labs doing their own genotyping, cloning, sequencing, mass-spec etc. All of them are doing it slightly differently and half of them are doing it worse than the median (and none of them think so)... We know that science is generally not very reproducible at the best of times (see Amgen's experiments reproducing 53 landmark cancer papers or Ioannidis' famous paper) so reducing experimental variation must be a good thing long-term. I really hope to see more competitors to Transcriptic and Emerald soon — and maybe even new approaches to defining and publishing experiments in a reproduction-friendly machine-readable format.