danieldwilliam October 13 2014, 11:54:20 UTC
Bespoke medicine is expensive.

I wonder how you could reduce the cost by an order of magnitude.

andrewducker October 13 2014, 12:01:32 UTC
Make it mainstream :-)

Seriously though, I suspect it will follow the standard curve for such things, and get cheaper over a decade or two as competitors constantly make the technology cheaper and faster.

danieldwilliam October 13 2014, 12:44:05 UTC
My reading of the article was that making the treatment mainstream is hard because it has to be made bespoke for each patient. It sounds like we are a ways away from automating some of the processes required.

But it also sounds like it wouldn't take much more than a halving of the cost for the treatment to reach rough cost parity with existing treatments.

(Which prompted the thought: I wonder whether the frightening cost projections for NHS / health spending in years to come take account of the fact that many cancer treatments, for example, come off patent but presumably don't become any less effective.)

andrewducker October 13 2014, 14:13:47 UTC
Yes, but "making it bespoke" could still be automated. You have a scanner that reads in the cancer details, some code to work out the necessary gene change, and something to then produce the correct personalised drug.

Which could still take twenty years to automate - or we could have some breakthroughs in the next couple of years that bring the price down very quickly.

And yes, running out of patent will be great - Julie's Imatinib goes out of patent in 2015/16, and I expect prices will drop significantly.

danieldwilliam October 13 2014, 14:34:42 UTC
Yes - it's an example of the sort of knowledge work that AI such as expert systems might be able to automate, but also, yes, it could be anytime between soon and next century before the details are worked out.

andrewducker October 13 2014, 12:03:49 UTC
Oh, and Julie's Imatinib is around $30,000 per year. So she'd have to be on it for 15 years before a one-off treatment paid for itself.

Of course, the costs of the consultant check-ups, blood tests, etc. are also a factor. As is her constant exhaustion and inability to be a generally productive member of society.
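
For what it's worth, the break-even arithmetic as a minimal Python sketch - the one-off cost of a bespoke treatment isn't quoted anywhere in this thread, so the $450,000 below is just an assumption (15 years x $30,000 run backwards); only the $30,000/year figure is from the comment above:

# Back-of-the-envelope break-even sketch.
imatinib_per_year = 30_000            # approximate annual drug cost, USD (stated above)
assumed_one_off_treatment = 450_000   # hypothetical one-off treatment cost, USD (assumption)

def breakeven_years(one_off_cost, annual_drug_cost):
    # Years of avoided drug costs needed before the one-off treatment pays for itself.
    return one_off_cost / annual_drug_cost

print(breakeven_years(assumed_one_off_treatment, imatinib_per_year))  # -> 15.0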

alitheapipkin October 13 2014, 13:04:19 UTC
The other consideration is that the research behind scientific advances is getting more and more expensive - from the analysis machines now in use to coping with the volume of data they create. The project I work on is funded to the tune of several million by the Wellcome Trust, and all we do is give scientists tools to help make use of the data they can now generate.

andrewducker October 13 2014, 14:15:49 UTC
Software tools or hardware tools?

I'm curious as to whether more open source tooling is being produced for this kind of thing.

alitheapipkin October 13 2014, 14:24:43 UTC
Software tools. And yes, there is lots of open source stuff going on, a lot of which grew out of scientists who code as a hobby writing the things they need to help themselves. The problem becomes getting the labs to invest in the necessary hardware and tech support to run that software. Just because our software is free doesn't mean it costs labs nothing to run, and we are constantly coming up against labs, particularly university research labs, that expect either their IT department (who are used to managing Windows desktops) or a random postdoc to have the skills to run a complex server system, and then moan at us when they can't.

andrewducker October 13 2014, 17:50:34 UTC
Oh yes - not planning for the cost of managing unusual or new systems is a common factor in things going Horribly Wrong.

alitheapipkin October 14 2014, 12:24:54 UTC
Unfortunately, research groups still seem to be in the mindset of getting funding for the expensive new analysis machine and someone to operate it, while neglecting to think about what other tech support their scientists need to actually work with the data they can now produce.

apostle_of_eris October 13 2014, 16:56:19 UTC
Look at the history of the Human Genome Project and its cost estimates. It went from unimaginable billions to a desktop unit for ten thousand dollars in relatively few years. I think there's every reason to believe there will continue to be a "Moore's Law" situation with the relevant technologies.
(Remember the "digital divide"?)

andrewducker October 13 2014, 17:19:12 UTC
Absolutely. This stuff gets cheap terribly quickly. Give it a decade and this will be everywhere.

andrewducker October 13 2014, 17:21:43 UTC
Full genome sequencing for about $1000, apparently:
http://systems.illumina.com/systems/hiseq-x-sequencing-system.ilmn

alitheapipkin October 14 2014, 12:16:54 UTC
The infrastructure to process and store all the data the new sequencing machines produce is still being developed, though; data handling, rather than producing the data in the first place, has become the new limiting factor. One of our work partners is developing tools for this at a research institute in Sardinia; it's really fascinating stuff, but the technology required is hideously complicated - even the software devs in the team struggle to follow what they are doing.

