
One thing that seems to be forgotten or lost in all of these armchair analyses is that while it is true that something like 3 mL of blood is collected for a conventional lab test, the machine itself only takes a few microliters (maybe tens of µL, depending on the test, of course) when it actually runs the test. The rest is frozen for follow-up or repeat testing, used for other tests, or simply discarded after some time.

EDIT: And my favorite part is that the first step on the Siemens machines they used for testing is to "dilute" the portion of the sample it is going to use (addition of DI water, to get the blood to the desired concentration). That is hard-coded into the protocol, with a specific dilution amount, and it is done for all the official Siemens tests as well, hah.
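The volumes in the parent comment imply that only a small fraction of a conventional draw is ever analyzed. A back-of-the-envelope sketch, using the commenter's own rough figures (3 mL drawn, "10s of µL" consumed; the 30 µL value below is an assumed midpoint, not a documented instrument spec):

```python
# Illustrative arithmetic only; exact volumes vary by assay and instrument.
COLLECTED_UL = 3_000  # ~3 mL drawn for a conventional lab test
ANALYZED_UL = 30      # assumed: "tens of uL" actually consumed by the analyzer

fraction_used = ANALYZED_UL / COLLECTED_UL
print(f"Fraction of the draw actually analyzed: {fraction_used:.1%}")  # 1.0%
```

So on these numbers roughly 99% of the tube is reserve volume, which is the commenter's point: the analyzer's per-test consumption is far smaller than the collected sample.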



According to an article in Nature[1]:

> Holmes described the miniLab as “the most important thing humanity has ever built”. But at best, the lab could do immunoassays using microfluidics. The tiny blood sample had to be diluted extensively (for which there are no reference standards or precedents), leading to artefacts and spurious results.

That (and other reports) sounds like they diluted far more than is usual.

[1] https://www.nature.com/articles/d41586-018-05149-2


While it is published on their site, it appears to be essentially a summary of Carreyrou's book. And I have yet to see anyone actually publish the dilution ratios and compare them to any existing process, so it seems like an easy, but easily incorrect, leap to conclude that it was "way more than usual".



