10 Incredible Ways Technology May Make Us Superhuman

In the last half of the twentieth century, medical science came up with some pretty astonishing ways to replace human parts that were starting to wear out. Though the idea is commonplace now, the invention of the artificial pacemaker in the ’50s must have seemed like science fiction come to life at the time. Today’s innovations routinely restore a modicum of hearing to the deaf and sight to the vision-impaired—and if a pacemaker won’t cut it, we can just replace that faulty heart like the water pump in your old Ford.

These technologies that were in their infancy just a few decades ago are now so well-established as to seem downright mundane. The medical tech that is in its infancy today likewise seems like science fiction—and if history has taught us anything, we’ll probably see a lot of it in use very soon (if it isn’t already). Oh, and while many of these technologies certainly have applications in replacing those worn-out parts, many others are intended specifically to improve upon perfectly good parts in unprecedented ways.

Brain–Computer Interfaces

A “BCI” is exactly what it sounds like—a communication link between the human brain and an external device. BCIs have been the realm of sci-fi for decades, but believe it or not, this stopped being speculative technology quite some time ago—there are many completely functional interfaces for a variety of applications, and the earliest devices of this type to be tested in humans showed up in the mid-’90s. And it’s safe to say that the research is not slowing down.

It has been known since the 1920s that the brain produces electrical signals, and it has been speculated ever since that those signals might be directed to control a mechanical device—or vice versa. Since research into BCIs began in earnest in the ’60s (with monkeys as the usual test subjects), many different models with different levels of “invasiveness” depending upon the application have been produced, with research progressing particularly quickly within the last 15 years or so.

Most applications involve either the partial restoration of sight or hearing, or the restoration of movement to paralysis sufferers. One completely non-invasive prototype, demonstrated in early 2013, enabled a paralyzed stroke victim to operate a computer. In a nutshell, the device—which amounts to a helmet—picks up the visual signals routed to the back of the brain and analyzes their frequencies to determine what the patient is looking at, enabling them to move a cursor on a screen using only eye movements.
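For the curious, here’s roughly what that frequency trick looks like in code. This is a minimal sketch of the general technique—each on-screen target flickers at its own rate, and the one being watched dominates the brain signal’s spectrum—not the actual prototype’s software; the sampling rate and flicker frequencies below are invented for illustration.

```python
# A minimal sketch of SSVEP-style gaze classification, the frequency-
# analysis trick described above. Every number here (sampling rate,
# flicker frequencies) is an illustrative assumption, not a spec of
# the actual 2013 prototype.
import numpy as np

FS = 256.0            # assumed EEG sampling rate, in Hz
TARGETS = {           # on-screen targets and their assumed flicker rates
    "left": 8.0,
    "right": 13.0,
    "select": 17.0,
}

def classify_gaze(eeg_window: np.ndarray) -> str:
    """Guess which flickering target the user is looking at.

    eeg_window is a 1-D array of samples from an electrode over the
    visual cortex; the attended target's flicker rate shows up as a
    spike in the signal's power spectrum.
    """
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / FS)
    # Score each target by the spectral power at its flicker frequency.
    scores = {name: spectrum[np.abs(freqs - f).argmin()]
              for name, f in TARGETS.items()}
    return max(scores, key=scores.get)

# Toy demo: two seconds of fake "EEG" dominated by the 13 Hz target.
t = np.arange(0, 2.0, 1.0 / FS)
fake_eeg = np.sin(2 * np.pi * 13.0 * t) + 0.5 * np.random.randn(t.size)
print(classify_gaze(fake_eeg))   # prints "right" almost every run
```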

Powered Exoskeletons

The general public’s concept of the powered exoskeleton is more like “powered battle armor” on account of the Robert Heinlein novel “Starship Troopers” and also a very popular character from an increasingly pervasive multimedia franchise. The tech that’s actually being developed is less geared toward battling giant robots and invading aliens, and more toward either restoring mobility to the disabled, or augmenting endurance and load-carrying capacity.

For example, one company manufactures a 50-pound aluminum and titanium suit called the Ekso that has seen use in dozens of hospitals around the U.S. It has enabled people with paralyzing spinal cord injuries to walk—an application that was once impractical due to the bulk and weight of such a suit.

The same technology was licensed by Lockheed Martin for their Human Universal Load Carrier (HULC, oddly enough), which has been extensively tested and may be deployed for military use within a year. It enables a man of ordinary conditioning to carry a 200-pound load at ten miles per hour, pretty much indefinitely, without breaking a sweat. While the Ekso takes pre-programmed steps for its users, the HULC uses accelerometers and pressure sensors to provide a mechanical assist to the user’s natural movements.
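To make that distinction concrete, here’s a toy sketch of the sensor-driven approach: read the joint sensors, estimate what the wearer’s muscles are trying to do, and add a proportional mechanical boost. Everything in it—the sensor fields, the gains, the inertia figure—is an invented placeholder, not Lockheed’s control system.

```python
# A toy sketch of sensor-driven assist in the HULC's general style:
# read the joint sensors, estimate the torque the wearer's own muscles
# are producing, and add a mechanical boost. The sensor layout, gains,
# and inertia figure are invented placeholders, not Lockheed's design.
from dataclasses import dataclass

@dataclass
class JointReading:
    angle: float          # joint angle in radians (unused in this sketch)
    angular_accel: float  # rad/s^2, from the leg accelerometer
    foot_pressure: float  # 0..1, from in-sole pressure sensors

ASSIST_RATIO = 0.8  # fraction of the estimated torque the suit supplies
LEG_INERTIA = 2.5   # kg*m^2, rough effective inertia of a loaded leg (assumed)

def assist_torque(reading: JointReading) -> float:
    """Torque (N*m) the actuator should add on this control tick."""
    # Only assist when the foot is loaded; a swinging leg moves freely.
    if reading.foot_pressure < 0.1:
        return 0.0
    # Estimate the torque the wearer is trying to produce and supply
    # most of it, leaving the wearer in control of the movement.
    intended = LEG_INERTIA * reading.angular_accel
    return ASSIST_RATIO * intended

# Example tick: a loaded, accelerating leg gets an 8 N*m boost.
print(assist_torque(JointReading(0.3, 4.0, 0.9)))
```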

We should note that a Japanese firm has produced a similar device with medical applications called “Hybrid Assistive Limb,” or HAL—which, being the name of a famously murderous fictional machine, we’re thinking might not have been such a hot idea. Oh, and the company’s name? Cyberdyne. We are not kidding.

Neural Implants

A neural implant is any device which is actually inserted inside the grey matter of the brain. While a neural implant can be a BCI and vice versa, the terms are in no way synonymous. What exoskeletons do for the body, implants do for the brain—while most are meant to repair damaged areas and restore cognitive function, others are meant to give the brain a power assist or a pathway to external devices.

The use of neural implants for deep brain stimulation—the transmission of regularly spaced electrical impulses to specific regions of the brain—has been approved by the Food and Drug Administration to treat various maladies, with the first approval coming in 1997. It has been proven effective at treating Parkinson’s disease and dystonia, and has also been used to treat chronic pain and depression with varying degrees of efficacy.

Thus far, the most commonly used neural implants are cochlear implants (approved by the FDA in 1984) and retinal implants, both pioneered in the 1960s and proven effective at partially restoring hearing and vision, respectively. Fun fact: the inventor of the cochlear implant was Dr. House—William House, who passed away in 2012, and whose brother Howard was also a physician.

Robotic Prosthetics

Prosthetics have been used to replace missing limbs for decades, but the modern version—cyberware—strives for not just an aesthetic replacement but a functional one: a missing limb restored with natural function and appearance. And while the use of the aforementioned brain interfaces to control robotic prosthetic devices is already happening, other explorations in this field seek to remove the limitations inherent to that scheme.

Many existing devices use non-invasive interfaces that detect the subtle movements of, say, chest or bicep muscles to control a robotic arm. Modern devices of this type are capable of some pretty fine motor control, having improved drastically in this respect over the last decade or so. Research is also underway to provide a two-way interface—a robotic prosthetic that will allow the patient to FEEL what they are touching with their artificial limb—but even this only scratches the surface of what’s being envisioned for the future of this tech.
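As a rough illustration of how this kind of muscle-signal (myoelectric) control works, the sketch below smooths the muscle’s electrical signal into an “effort” level and maps it to a grip command. The sampling rate and threshold are made-up values, not any manufacturer’s.

```python
# A rough sketch of myoelectric control as described above: rectify and
# smooth the muscle's electrical signal (EMG) into an "effort" envelope,
# then map that effort to a grip command. The sampling rate and threshold
# are made-up placeholder values, not any manufacturer's.
import numpy as np

FS = 1000.0            # assumed EMG sampling rate, in Hz
OPEN_THRESHOLD = 0.15  # assumed effort level below which the hand stays open

def emg_envelope(raw: np.ndarray, window: int = 100) -> np.ndarray:
    """Rectify raw EMG and smooth it with a moving average."""
    rectified = np.abs(raw)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def grip_command(raw: np.ndarray) -> float:
    """Map the current muscle effort to a grip closure from 0.0 to 1.0."""
    effort = emg_envelope(raw)[-1]   # most recent envelope value
    if effort < OPEN_THRESHOLD:
        return 0.0                   # muscle relaxed: hand open
    return min(1.0, (effort - OPEN_THRESHOLD) / (1.0 - OPEN_THRESHOLD))

# Toy demo: the muscle "fires" in the second half of a one-second window.
t = np.arange(0, 1.0, 1.0 / FS)
burst = 0.6 * np.random.randn(t.size) * (t > 0.5)
print(round(grip_command(burst), 2))   # a partially closed grip
```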

At Harvard, the emerging fields of tissue engineering and nanotechnology have been combined to produce a “cyborg tissue”—an engineered human tissue with embedded, functional, bio-compatible electronics. Says research team lead Charles Lieber: “With this technology, for the first time, we can work at the same scale as the unit of biological system without interrupting it. Ultimately, this is about merging tissue with electronics in a way that it becomes difficult to determine where the tissue ends and the electronics begin.” And we are officially talking about full-on cyborg technology, in development right now.

The Exocortex

Extrapolating many concepts from the previous examples into the future, consider the Exocortex. This is a theoretical information processing system that would interact with, and enhance the capabilities of, your biological brain—the true merging of mind and computer.

This doesn’t just mean that your brain would have better information storage (though it would mean that), but better processing power—exocortices would aid in high-level thinking and cognition. And if that sounds a little heavy, remember that humans have long used external systems for this purpose. After all, we couldn’t have modern mathematics and physics without the ancient technologies of writing and numbers, and computers are merely another point on that same long, long technological curve.

Also, consider that we already use computers as extensions of ourselves. The Internet itself can be thought of as a sort of prototype of this very technology, as it gives us all access to vast stores of information; and the devices we use to access it—our computers—give us the means by which to process and assimilate that information with our brains, which are just bigger processing devices. Merging the two processors can theoretically give us the means by which to truly level the playing field in terms of human intellect, and enable us all to perform the most complex of high level mental functions with just as much ease as you are reading this article. Theoretically.

Genetic Engineering

Human gene therapy and genetic engineering hold at once more promise AND more potential for a vast array of complications than perhaps any other scientific development ever. The understanding of evolution and the ability to modify genetic components are so new to science that it is a gross understatement to say that their implications are not yet understood; of the applications that are known to be possible (and there are many), the majority are still in the “too dangerous to even attempt on humans” phase of development.

The most obvious application is the eradication of genetic diseases. Some genetic conditions can be cured in adults by gene therapy, but the ability to test for said conditions in embryos is where the real promise lies—though the ethical implications here are staggering. It’s possible to test not only for genetic diseases and abnormalities, but for other “conditions” like eye color and sex—and the possibility of actually being able to design your baby from the ground up is absolutely within sight. Of course, we all know how expensive technology works in a free market, and it’s easy to envision a future where only the wealthy can afford “enhancements” for their offspring. Considering that we humans have demonstrated a very limited ability to reconcile differences in race, gender, and sexuality, it’s safe to say that this technology may very well lead to the most complicated social issues in the history of humanity.

Indeed, researchers have been able to easily create mice with enhanced strength and endurance, and this field also includes stem cell research, with its promise to eventually be able to cure damn near anything. When it comes to the potential for increasing the durability and longevity of the human body, not many fields hold more promise—except perhaps for one…

Nanomedicine

Nanotech is quite prevalent in the public imagination as a likely cause of the end of the world, but this is a technology that is coming along at a lightning pace—and its medical applications, taken to their logical endpoint, hold the promise of nothing less than the eradication of all human diseases and maladies—up to and including death.

Current nanomedicine applications involve new and highly accurate ways to deliver drugs to specific locations in the body, along with other treatment methods involving tiny particles—tiny on a molecular scale—dispersed into the body. For example, an experimental lung cancer treatment uses nanoparticles that are inhaled as an aerosol and settle in diseased areas of the lungs; using an external magnetic field, the particles are then superheated, killing the diseased cells. The body’s own response eliminates the dead cells AND the nanoparticles. This method has been used successfully in mice, and while it will not yet kill 100 percent of the diseased cells in an affected area, it’s close—and the tech is in its infancy.

Speculative uses of this technology involve nanobots—microscopic, self-replicating machines that could be programmed to target cells for destruction, drug delivery, or rebuilding. This could theoretically apply not only to diseased cells but to damaged ones—perhaps allowing for much speedier recovery from injury, and even the reversal of aging. The logical progression here ends with a remarkably durable, age-proof human body—but even if that never comes to fruition, it’s not as if this is the only way we’re attempting to cheat death with science…

Brain Preservation

It is here that we get into the realm of what has become known as “Transhumanism”—the notion that we may one day be able to surpass our physical limits, perhaps even discarding our bodies or living beyond them. This notion was first suggested as a realistic prospect by Robert Ettinger, who wrote “The Prospect Of Immortality” in 1962, and who is considered a pioneering Transhumanist and the father of Cryonics.

That is essentially the study of the preservation of humans or animals (or parts of them, like the brain) using extremely low temperatures (below −150 °C, or −238 °F), which was the best means of preservation available at the time Ettinger wrote his book. Today’s brain preservation studies focus more on chemical preservation, which has been demonstrated on brain tissue (but not an entire brain) and does not require the ridiculous temperatures demanded by Cryonics.

This is, of course, an inexact science—researchers in the field are well aware that it’s impossible (at this point) to determine how much, if any, of what makes up a person’s mind is preserved along with the brain, no matter how physically perfect the preservation. It’s a field that relies on the further emergence of developing, overlapping sciences that are still in the purely speculative region, such as…

Artificial Bodies

As we’re able to replace more and more of our body parts with versions that have been engineered, grown in a lab, or both, it stands to reason that we’ll one day reach an endpoint—a point at which every part of the human body, including the brain, can be replicated. Right now, a collaborative effort among 15 research institutions is underway to create hardware that emulates different sections of the human brain—the first prototype being an eight-inch wafer containing 51 million artificial synapses.

Oh, and the “software” is being replicated too—the Swiss “Blue Brain Project” is currently using a supercomputer to reverse-engineer the brain’s processing functions, and many elements of the activity of a rat brain have already been successfully simulated. The project’s leader, Henry Markram, has told the BBC that they will build an artificial brain within ten years.

Our muscles, blood, organs—artificial versions of all are in various stages of development, and at some point the prospect of assembling a fully functional artificial human body will be within sight. But even if we develop the software to run such magnificent machines—and having androids would be pretty cool—their real significance for us would only arrive with the development of a complementary technology, one that is less farfetched than it may seem…

Mind Uploading

We’ve previously mentioned futurist Ray Kurzweil and his insanely accurate track record of predicting new technologies. Kurzweil is of the opinion that between 2040 and 2045, we will be able to literally upload the contents of our consciousness into a computer—and he’s not even the only one who thinks so.

Of course, many argue that brain functions cannot be reduced to simple computation—that they are not “computable” and that consciousness itself poses a problem that science will never be able to solve. There is also the matter of whether an uploaded or otherwise “backed up” mind is indeed a different entity from that which was copied, a different consciousness altogether. Hopefully, these are questions that neuroscience will soon be able to answer.

But if indeed we are ever able to inject our very minds into the digital realm, the obvious implication is that our consciousness need never terminate—we need never die. We can hang out indefinitely in fantastically rendered digital worlds, and load ourselves into a Cyberdyne X-2000 Mind Vessel when we have business in the real world; transmit ourselves through space, perhaps even through time, and share knowledge instantaneously across all of humanity.

Smarter people than us are expecting these developments within your lifetime. Even if they are only partially correct, we’re going to go out on a limb and say that despite the exponential explosion of technology within the last couple decades, we ain’t seen nothing yet.

Read more: http://listverse.com/2013/05/12/10-incredible-ways-technology-may-make-us-superhuman/

Another 10 Curious Everyday Inventions

Nearly two years ago we wrote a list of everyday inventions. The list was relatively popular for its time and debunked at least one myth about the invention of peanut butter. This list is the second installment and looks at ten more items that we all come into contact with in our daily lives. These are things we tend to take for granted and we certainly wouldn’t know the name of the inventor if asked.

The Garden Gnome

The first garden gnomes were made in Gräfenroda, a town known for its ceramics in Thuringia, Germany in the mid-1800s. Philip Griebel made terracotta animals as decorations, and produced gnomes based on local myths as a way for people to enjoy the stories of the gnomes’ willingness to help in the garden at night. The garden gnome quickly spread across Germany and into France and England, and wherever gardening was a serious hobby. Griebel’s descendants still make them and are the last of the German producers. Garden gnomes were first introduced to the United Kingdom in 1847 by Sir Charles Isham, when he brought 21 terracotta figures back from a trip to Germany and placed them as ornaments in the gardens of his home, Lamport Hall in Northamptonshire. Only one of the original batch of gnomes survives: Lampy, as he is known, is on display at Lamport Hall, and is insured for one million pounds. He is pictured above.

The Friction Match

While matches existed in China in the 6th century and in Europe from the 16th century, it was not until the 1800s that friction matches as we know them today were invented. The first “friction match” was invented by English chemist John Walker in 1826. Early work had been done by Robert Boyle and his assistant, Godfrey Haukweicz, in the 1680s with phosphorus and sulfur, but their efforts had not produced useful results. Walker discovered that a mixture of stibnite, potassium chlorate, gum, and starch could be ignited by striking it against any rough surface. Walker called the matches congreves, but the process was patented by Samuel Jones and the matches were sold as lucifer matches (as they are still known in the Netherlands). In 1862, Bryant and May, the British match manufacturers, began mass-producing the red-tipped matches we all know today, under a patent from the Lundström brothers of Sweden.

Contact Lenses

Contact lenses are older than most of us realize. In 1888, the German physiologist Adolf Eugen Fick constructed and fitted the first successful contact lens. While working in Zürich, he described fabricating afocal scleral contact shells, which rested on the less sensitive rim of tissue around the cornea, and experimentally fitting them: initially on rabbits, then on himself, and lastly on a small group of volunteers. These lenses were made from heavy blown glass and were 18–21mm in diameter. Fick filled the empty space between cornea/callosity and glass with a dextrose solution. Fick’s lens was large, unwieldy, and could only be worn for a few hours at a time. It was not until 1949 that the first lenses were produced that sat on the cornea only and allowed for many hours of wear.

The Washing Machine

The first patent for a non-electrical washing machine was issued in England in 1692. Nearly two hundred years later, Louis Goldenberg of New Brunswick, New Jersey invented the electric washing machine (late 1800s to early 1900s). He worked for the Ford Motor Company at the time, and all inventions created while working for Ford under contract belonged to Ford, so the patent would have been listed under Ford and/or Louis Goldenberg. Alva J. Fisher has been incorrectly credited with the invention of the electric washer; the US patent office shows at least one patent issued before Mr. Fisher’s US patent number 966677.

The Ring Pull

The early metal beverage can was made out of steel and had no pull-tab. Instead, it was opened by a can piercer, a device resembling a bottle opener, but with a sharp point. The can was opened by punching two triangular holes in the lid — a large one for drinking, and a small one to admit air. This type of opener is sometimes referred to as a churchkey. As early as 1936, inventors were applying for patents on self-opening can designs, but the technology of the time made these inventions impractical. Later advancements saw the ends of the can made out of aluminum instead of steel. In 1962, Ermal Cleon Fraze of Dayton, Ohio, invented the integral rivet and pull-tab (also known as rimple or ring pull), which had a ring attached at the rivet for pulling, and which would come off completely to be discarded. These were eventually replaced almost exclusively by the stay tabs we still use today. Stay tabs (also called colon tabs) were invented by Daniel F. Cudzik of Reynolds Metals in Richmond, Virginia, in 1975.

The Condom

The first rubber condom was produced in 1855. For many decades, rubber condoms were manufactured by wrapping strips of raw rubber around penis-shaped molds, then dipping the wrapped molds in a chemical solution to cure the rubber. In 1912, a German named Julius Fromm developed a new, improved manufacturing technique for condoms: dipping glass molds into a raw rubber solution. Called cement dipping, this method required adding gasoline or benzene to the rubber to make it liquid. These condoms were re-usable. Latex, rubber suspended in water, was invented in 1920. Latex condoms required less labor to produce than cement-dipped rubber condoms, which had to be smoothed by rubbing and trimming. The use of water to suspend the rubber instead of gasoline and benzene eliminated the fire hazard previously associated with all condom factories. Latex condoms also performed better for the consumer: they were stronger and thinner than rubber condoms, and had a shelf life of five years (compared to three months for rubber).

Tin Foil

Foil made from a thin leaf of tin was commercially available before its aluminum counterpart. In the late 19th century and early 20th century, tin foil was in common use, and some people continue to refer to the new product by the name of the old one. Tin foil is stiffer than aluminum foil. It tends to give a slight tin taste to food wrapped in it, which is a major reason it has largely been replaced by aluminum and other materials for wrapping food.

The first audio recordings on phonograph cylinders were made on tin foil. Tin was first replaced by aluminum starting in 1910, when the first aluminum foil rolling plant, “Dr. Lauber, Neher & Cie., Emmishofen.” was opened in Kreuzlingen, Switzerland.

The Ballpoint Pen

The first patent on a ballpoint pen was issued on 30 October 1888, to John J. Loud, a leather tanner, who was attempting to make a writing implement that could write on the leather he tanned—something the then-common fountain pen couldn’t do. The pen had a small rotating steel ball, held in place by a socket. Then, fifty years later, the journalist László Bíró, with the help of his brother George, a chemist, began to work on designing new types of pens. Bíró fitted his pen with a tiny ball in its tip that was free to turn in a socket. As the pen moved along the paper, the ball rotated, picking up ink from the ink cartridge and leaving it on the paper. Bíró filed a British patent on 15 June 1938. Earlier pens leaked or clogged due to improper viscosity of the ink, and depended on gravity to deliver the ink to the ball, which caused difficulties with the flow and required that the pen be held nearly vertically. The Biro pen both pressurized the ink column and used capillary action for ink delivery, solving the flow problems.

Shampoo

Shampoo originally meant head massage in several North Indian languages. Both the word and the concept were introduced to Britain from colonial India. The term and the service were introduced in Britain by a Bengali entrepreneur, Sake Dean Mahomed, in 1814, when Dean, together with his Irish wife, opened a shampooing bath known as ‘Mahomed’s Indian Vapour Baths’ in Brighton, England. In shampoo’s early days, English hair stylists boiled shaved soap in water and added herbs to give the hair shine and fragrance. Kasey Hebert was the first known maker of shampoo, and its origin is currently attributed to him. Originally, soap and shampoo were very similar products, both containing surfactants, a type of detergent. Modern shampoo as it is known today was first introduced in the 1930s with Drene, the first synthetic (non-soap) shampoo.

The Chocolate Bar

Up to and including the 19th century, candy of all sorts was typically sold by weight, loose, in small pieces that would be bagged as bought. The introduction of chocolate as something that could be eaten as is, rather than used to make beverages or desserts, resulted in the earliest bar forms, or tablets. In 1847, the Fry’s chocolate factory, located in Union Street, Bristol, England, moulded the first ever chocolate bar suitable for widespread consumption. The firm began producing the Fry’s Chocolate Cream bar (arguably the best-tasting chocolate bar in the world, in my opinion) in 1866. Over 220 products were introduced in the following decades, including the first chocolate Easter egg in the UK in 1873 and the Fry’s Turkish Delight (or Fry’s Turkish bar) in 1914. In 1919, the company merged with Cadbury, and the joint company was named the British Cocoa and Chocolate Company.

This article is licensed under the GFDL because it contains quotations from Wikipedia.

Read more: http://listverse.com/2009/08/20/another-10-curious-everyday-inventions/

Top 10 Worst Firearms in History

After seeing the list of the 10 best firearms, I decided it would be amusing to do a list of the 10 worst. As criteria for the worst firearms, I looked at the reliability, safety, and utility of the weapons at the time they were made. If you think I have missed any awful firearms, be sure to mention them in the comments for a follow-up list.

Revolving Rifles

While these rifles were a good increase in firepower for the people of the Old West starting in the 1830s, they had some very noticeable drawbacks. All variants leaked firing gases at the front of the cylinder, with a corresponding drop in muzzle velocity. On the double-action variants, as the cylinder cycled for each shot, the just-fired chamber had a tendency to send hot gas at the hand of the firer. This only ranks at number 10 on the list because the problems were tolerable in comparison to the benefits of more firepower.

The Liberator Pistol

The Liberator was a single-shot pistol stamped out of sheet metal, made for dropping behind enemy lines into the hands of resistance movements during WWII. It was lacking because you only got a single .45 ACP shot at an enemy who probably had a semi-automatic pistol or rifle, or a fully automatic submachine gun. Reloading was also extremely troublesome, as you had to push a stick down the barrel to force the spent cartridge out.

The Gyrojet

The Gyrojet was a hand-held rocket launcher developed in the 1960s. It fired 13mm rockets, differing from most firearms in that velocity increased after the projectile left the barrel. One major problem, though, was that it often lacked the power to kill at close range—which is really not good for a pistol design. On some occasions the projectile just fell out the end of the barrel.

The Boys Anti-Tank Rifle

The Boys anti-tank rifle was an early anti-tank weapon unsuccessfully used at the beginning of WWII. It was a five-shot rifle that weighed 16.33kg (36lb) and fired a 13.97mm (.55 caliber) armor-piercing round capable of penetrating 21mm of armor at 300m. It was under-powered at the start of WWII, as it could not cope with German panzers’ armor. It was also a bit heavy for a soldier to lug around, and its recoil was ferocious.

The Nock Volley Gun

The Nock volley gun first appeared around 1780 and fired seven .50 caliber slugs at the same time. It was good for repelling boarders in naval combat, but its recoil could break the firer’s shoulder. It also had a tendency to set a ship’s rigging on fire with its muzzle blast.

The Cochran Revolvers

Perhaps one of the least well-known firearms on the list, the Cochran revolvers had a cylinder that revolved horizontally. Basically, this meant that every time you fired, you had a loaded round pointed back at you. Everything had to be machined precisely, because if a chamber was bored just a fraction of an inch too deep, the round pointed at you would fire as well.

The Type 94 Nambu Pistol

A Japanese pistol design of WWII, it fired the 8mm round of the Taisho 14. It was underpowered, cumbersome, awkward to use, and extremely unsafe. Because the firing sear projected from the side of the frame, the pistol was easy to fire by accident; it was even possible to fire a cartridge before it was fully in the chamber. It was considered more dangerous to its user than to its target.

The Pepperbox Revolver

The pepperbox revolver was used mainly before the Colt style of revolver caught on. It was heavy because of its multiple barrels; sometimes all of the shots would go off at once from chain-firing and break the firer’s wrist; sometimes it would simply explode; and it was wildly inaccurate. According to some wits, the safest place to be when it went off was right in front of it.

The German WWI Flamethrower

This pick is more literally a firearm than the others on the list, as it is a German WWI-era flamethrower. It was manned by a two-person team and, because of the extreme danger, was operated only by convicts. Basically, it was a bomb with a guaranteed two people in close proximity. It was large and heavy and made an ideal target. And since Allied soldiers found it barbaric, they were very unlikely to let the operator surrender alive.

The Chauchat

A French light machine gun so bad that soldiers who were issued it threw it away in favor of rifles. Issued during WWI, it was so shoddily constructed that parts were not interchangeable from one Chauchat to another. The magazine, with its big openings in the sides, begged for dirt and mud to mix with the cartridges, which resulted in immediate jamming and left the weapon useless—a serious matter, since trench warfare is all about mud and dirt. The main reason the Chauchat tops the list of worst firearms is that there were plenty of decent light machine guns around at the time, and they still issued this piece of junk instead.

Read more: http://listverse.com/2012/05/06/top-10-worst-firearms-in-history/

Top 10 Scientists Killed or Injured by Their Experiments

Man owes a great debt to the scientists on this list; all of them died or were injured in their pursuit of knowledge. The advances they have all made to science are extraordinary and many of them paved the way for some of man’s greatest discoveries and inventions.

Carl Wilhelm Scheele

Scheele was a brilliant pharmaceutical chemist who discovered many chemical elements – the most notable of which were oxygen (though Joseph Priestley published his findings first), molybdenum, tungsten, manganese, and chlorine. He also discovered a process very similar to pasteurization. Scheele had the habit of taste testing his discoveries and, fortunately, managed to survive his taste-test of hydrogen cyanide. But alas, his luck was to run out: he died of symptoms strongly resembling mercury poisoning.

Jean-François Pilâtre de Rozier

Jean-François was a teacher of physics and chemistry. In 1783, he witnessed the world’s first balloon flight, which created in him a passion for flight. After assisting in the untethered flight of a sheep, a chicken, and a duck, he took the first manned free flight in a balloon, travelling at an altitude of 3,000 feet in a hot air balloon. Not stopping there, de Rozier planned a crossing of the English Channel from France to England. Unfortunately, it was his last flight; after reaching 1,500 feet in a combined hot air and gas balloon, the balloon deflated, causing him to fall to his death. His fiancée died eight days later—possibly from suicide.

Sir David Brewster

Sir David was a Scottish inventor, scientist, and writer. His field of interest was optics and light polarization – a field requiring excellent vision. Unfortunately for Sir David, he performed a chemical experiment in 1831 which nearly blinded him. While his vision did return, he was plagued with eye troubles until his death. Brewster is well known for having been the inventor of the kaleidoscope – a toy that has brought joy to millions of children over the years.

Elizabeth Fleischman Ascheim

Elizabeth Fleischman Ascheim married her doctor, Dr. Woolf, shortly after her mother died. Because of his medical position, Woolf was very interested in the new discovery of Wilhelm Conrad Röntgen—x-rays. His new wife became equally interested, and she gave up her job as a bookkeeper to undertake studies in electrical science. Eventually she bought an x-ray machine, which she moved into her husband’s office—this was the first x-ray lab in San Francisco. She and her husband spent some years experimenting with the machine, using themselves as subjects. Unfortunately, they did not realize the consequences of their lack of protection, and Elizabeth died of an extremely widespread and violent cancer. Information on Ascheim is scarce, so I recommend you read this PDF on her life.

Alexander Bogdanov

Bogdanov was a Russian physician, philosopher, economist, science fiction writer, and revolutionary. In 1924, he began experiments with blood transfusion—most likely in a search for eternal youth. After 11 transfusions (which he performed on himself), he declared that he had suspended his balding and improved his eyesight. Unfortunately for Bogdanov, the science of transfusion was a young one, and Bogdanov was not one to test the health of the blood he was using, or of its donor. In 1928, Bogdanov took a transfusion of blood infected with malaria and tuberculosis, and he died shortly after.

Robert Bunsen

Robert Bunsen is probably best known for having given his name to the bunsen burner, which he helped to popularize. He started out his scientific career in organic chemistry but nearly died twice of arsenic poisoning. Shortly after his near-death experiences, he lost the sight in his right eye in an explosion of cacodyl cyanide. These being excellent reasons to change fields, he moved into inorganic chemistry and went on to develop the field of spectroscopy.

Sir Humphry Davy

Sir Humphry Davy, the brilliant British chemist and inventor, got a very bumpy start to his science career. As a young apprentice, he was fired from his job at an apothecary because he caused too many explosions! When he eventually took up the field of chemistry, he had a habit of inhaling the various gases he was dealing with. Fortunately, this bad habit led to his discovery of the anesthetic properties of nitrous oxide. Unfortunately, the same habit led to him nearly killing himself on many occasions, and the frequent poisonings left him an invalid for the remaining two decades of his life. During this time he also permanently damaged his eyes in a nitrogen trichloride explosion.

Michael Faraday

Thanks to the injury to Sir Humphry Davy’s eyes, Faraday became his apprentice. He went on to improve on Davy’s methods of electrolysis and to make important discoveries in the field of electromagnetics. Unfortunately for him, some of Davy’s misfortune rubbed off, and Faraday also suffered damage to his eyes in a nitrogen trichloride explosion. He spent the remainder of his life suffering chronic chemical poisoning.

Marie Curie

In 1898, Curie and her husband, Pierre, discovered radium. She spent the remainder of her life performing radiation research and studying radiation therapy. Her constant exposure to radiation led to her contracting leukemia and she died in 1934. Curie is the first and only person to receive two Nobel prizes in science in two different fields: chemistry and physics. She was also the first female professor at the University of Paris.

Galileo Galilei

Galileo’s work on the refinement of the telescope opened up the dark recesses of the universe for future generations, but it also ruined his eyesight. He was fascinated with the sun and spent many hours staring at it – leading to extreme damage to his retinas. This was the most likely cause of his near blindness in the last four years of his life. Because of his life’s work, he is sometimes referred to as the “father of modern physics”.

Louis Slotin

I normally don’t update a list once it is posted (aside from correcting factual errors), but mudbug raised an interesting addition that I hadn’t heard of—so here it is. Canadian-born Slotin worked on the Manhattan Project (the US project to design the first nuclear bomb). In the course of his experimentation he accidentally dropped a sphere of beryllium onto a second sphere, causing a prompt critical reaction (the spheres were wrapped around a plutonium core). Other scientists in the room witnessed a “blue glow” of air ionization and felt a “heat wave”. Slotin rushed outside and was sick; he was taken to hospital and died nine days later. The amount of radiation he was exposed to was equivalent to standing 4,800 feet away from an atomic bomb explosion. The accident prompted the end of all hands-on assembly work at Los Alamos. I strongly recommend you read the Wikipedia article on this critical event.

Notable mentions: Rosalind Franklin

Read more: http://listverse.com/2008/06/04/top-10-scientists-killed-or-injured-by-their-experiments/