Health and Well-Being · National Security · Prosperity
Federally-funded research gave us...
treatments for melanoma
We're crowd-sourcing societal benefits
that stem from government-funded research in the United States.
54 examples and counting.
Please help -- it only takes a minute to click and
send in an idea for an example.
Even if all you have is a title for a possible example and a link to point us to further reading, that's fine, we can take it from there.
Also, your idea for an example doesn't have to be perfect or earth-shaking. If a straight line can be drawn from the research to a realized or expected tangible benefit, then that's great as an example 😊👍. Thanks for helping!

To help astronauts survive long-term missions, NASA started thinking about ways to grow food in space. This required reducing the amount of ethylene in the air. Ethylene is a gas released by plants, and its build-up in a closed environment (like the space station) leads to wilting and accelerated plant decay. NASA funded research at the University of Wisconsin Madison to work on this problem. It turns out that the system the researchers developed to remove ethylene from the air can also be used to remove other airborne organic compounds, making it the basis for a reportedly high-quality air purifier. The technology was licensed by a private company, and is now used to build home and commercial air purifiers.
If you heard that the National Institutes of Health was funding research on lizards, you might at first think that the N.I.H. was wasting taxpayer money. But in fact, a researcher at the Veterans Affairs Medical Center in the Bronx discovered that the venom of Gila monsters (lizards from the Southwest of the U.S.A.) is an excellent promoter of insulin release. Insulin is the hormone that our bodies use to regulate blood sugar levels. Further work by a researcher at the National Institute on Aging helped take that discovery to the clinic and to the production of a drug now widely used to combat type 2 diabetes. Thanks to government-funded scientists and their research, Gila monsters are saving lives.
Cancer is a devastating disease. Each year in the United States, approximately two million people are diagnosed with cancer, and over half a million die from cancer-related causes. Among cancers, late-stage melanoma is one of the deadliest. However, funded by the National Institutes of Health, basic research into immunology in the 1990s by James Allison (who was then at UC Berkeley, and is now at the M.D. Anderson Cancer Center in Houston, TX) led to the development of the drug ipilimumab.

Our immune system normally protects us from invaders (like bacteria and viruses) by attacking them. Cancer is also an invader, but many cancers manage to block the immune system’s attack. Allison discovered how to prevent that blockage, thus unleashing the immune system on cancerous tumor cells. This is the basis for how ipilimumab works. Ipilimumab has substantially improved life expectancy and lowered death rates for melanoma patients with a poor prognosis. Astonishingly, a subset of individuals even achieved complete elimination of their cancer, something previously thought impossible in late-stage melanoma.

For his impact on immunology and the treatment of melanoma, James Allison (a “blues-loving scientist from the small town of Alice, Texas”) was awarded the 2018 Nobel Prize in Physiology or Medicine. With this research, the N.I.H. helped launch a new era of cancer immunotherapy and the ongoing development of many new drugs that seek to harness our immune systems to attack malignant tumors.
The Internet is fundamental to many aspects of our lives: communications, commerce, entertainment, and much more. It began as the ARPANET, a project funded by the Advanced Research Projects Agency of the Department of Defense, and as NSFNET, funded by the National Science Foundation. This government-funded work, built initially to connect researchers to each other, now connects the entire world and powers our economy.
Before touchscreens became the standard way we interact with our phones, devices, and electronics, they were the Ph.D. thesis project of a graduate student at the University of Delaware who was funded by a fellowship from the National Science Foundation. Wayne Westerman’s dissertation developed the multi-touch capability that was necessary for touchscreens to take off as a commercial product. The company that he and his Ph.D. adviser started was purchased by Apple, its technology was incorporated into the iPhone, and the rest is history.
Postpartum depression (PPD) affects 1 in 8 women and is a leading cause of maternal mortality following childbirth. Until recently there was no treatment specifically for PPD, with standard antidepressants proving ineffective and too slow. With funding from the NIH, basic research into a hormone called allopregnanolone revealed its role in regulating specific types of neurons in the brain associated with PPD. This research eventually led to the first-ever FDA-approved drug for treating PPD, brexanolone. This drug is not only highly effective compared to standard antidepressants, but importantly takes effect within hours instead of the multiple weeks associated with standard treatments. Thanks to basic research funded by the NIH, we now have an effective treatment for PPD.
Today, GPS is ingrained in our lives, from figuring out the best route home after a road closure to making sure a loved one reached their destination safely. The modern system, which now consists of 31 satellites orbiting the Earth, five ground control stations located around the globe, and the many receivers found in our phones, cars, and other devices, began as a project under the Department of Defense in 1973. The work was carried out across several federally funded institutions: the US Naval Research Laboratory, The Johns Hopkins University Applied Physics Laboratory, and The Aerospace Corporation. The system became fully operational in 1993, and in 2000 its full capabilities were made freely accessible across the globe for peaceful purposes.
Anesthesia, while essential for surgeries, requires careful monitoring. You want the patient to be anesthetized enough that they won’t feel or remember anything. But you need to be careful not to anesthetize them so deeply that the anesthesia itself might cause later complications. For example, overuse of some important anesthetics such as propofol can sink a patient into a state known as “burst suppression”. If a patient falls into this state during surgery, then after they wake up, they may experience confusion, delirium and memory loss – sometimes for months. We want to avoid that.

So, how much is too much propofol? How can the anesthesiologist tell if the patient is getting close to the burst suppression state, so that they can then back off a bit? In research funded by the National Institutes of Health, a team at the Massachusetts Institute of Technology (MIT) used a mathematical tool known as state-space modeling to analyze the EEGs of patients undergoing anesthesia. They discovered that the strength of fluctuations in the amplitude of a particular type of brain waves is a very good measure of how close a patient is to the burst suppression state. This means that those fluctuations can be used to tell when to back off on the propofol.

While this result is too new to have made it into the clinic quite yet, it is very promising: using the new measure that the team discovered, anesthesiologists will be able to better protect their patients.
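For curious readers, here is a tiny Python sketch of the general idea – an illustration only, not the MIT team’s actual state-space method. It band-pass filters a simulated EEG trace around a slow-wave band and measures how strongly the amplitude envelope fluctuates; every signal parameter in it is a made-up assumption.

```python
# Toy illustration (not the MIT group's algorithm): estimate how strongly the
# amplitude of a chosen EEG frequency band fluctuates over time.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 250.0                       # sampling rate in Hz (assumed)
t = np.arange(0, 30, 1 / fs)     # 30 seconds of simulated EEG

# Simulated signal: a slow oscillation whose amplitude waxes and wanes,
# plus background noise. Real EEG would come from the patient's monitor.
envelope_true = 1.0 + 0.6 * np.sin(2 * np.pi * 0.05 * t)
eeg = envelope_true * np.sin(2 * np.pi * 1.0 * t) + 0.3 * np.random.randn(t.size)

# Band-pass filter around a slow-wave band (roughly 0.5-4 Hz here; assumed).
sos = butter(2, [0.5, 4.0], btype="band", fs=fs, output="sos")
slow = sosfiltfilt(sos, eeg)

# Amplitude envelope via the Hilbert transform, then the strength of its
# fluctuations (standard deviation relative to its mean).
env = np.abs(hilbert(slow))
fluctuation_strength = np.std(env) / np.mean(env)
print(f"Relative amplitude fluctuation: {fluctuation_strength:.2f}")
```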
When a tornado begins to form, high-speed winds several kilometers above the ground begin to move in a tight rotation known as a “Tornado Vortex Signature” (TVS). Detecting this raises a warning about a possible tornado well before the tornado touches down on the ground. But how can we tell this distant TVS is happening? Tornadoes often form in thunderstorms, and therefore carry droplets of rain with them. Which leads us to the secret sauce: radar signals bounce off rain droplets, and how they bounce off the droplets is affected by the motion of the droplets. That can be leveraged into using radar to measure the speed of the droplets, and so detect the TVS.

How is that leveraging done? In 1973, researchers in Norman, Oklahoma, working for the United States Weather Bureau (which later became part of NOAA), figured out how to do it using a principle known as the Doppler effect. You know how a siren on an ambulance sounds high-pitched if the ambulance is driving towards you, but sounds lower-pitched once the ambulance is past you and is driving away? The very same applies to radar and rain droplets: the radar tower sends out a signal at some frequency. If it bounces off a droplet that is moving away, the radar signal that comes back to the radar tower will have a lower frequency. But if the droplet is moving towards the radar tower, the radar signal that comes back to the tower will have a higher frequency. How much higher or lower depends on the droplet’s speed. Using this, radar can be used to measure storm wind speed and direction. When two close-by locations are measured to have very different wind speeds, or even opposing wind directions, that suggests the tight rotation that is a TVS.

This principle forms the basis of modern tornado early-warning systems, and helps to keep millions of people safe.
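The arithmetic at the heart of the method is simple: the droplet’s speed toward or away from the radar is proportional to the frequency shift. Below is a small Python sketch of that calculation plus a crude “opposing velocities” check; the transmit frequency, shift values, and threshold are illustrative assumptions, not NOAA’s actual detection criteria.

```python
# Toy illustration of the Doppler-velocity idea (not NOAA's actual algorithm).
import numpy as np

C = 3.0e8          # speed of light, m/s
F0 = 2.8e9         # assumed radar transmit frequency (~S-band), Hz

def radial_velocity(freq_shift_hz):
    """Speed of the rain droplets toward (+) or away from (-) the radar.
    The reflected signal is shifted twice, hence the factor of 2."""
    return C * freq_shift_hz / (2 * F0)

# Made-up frequency shifts measured at neighboring radar gates (Hz).
shifts = np.array([150.0, 120.0, -140.0, -160.0])
velocities = radial_velocity(shifts)
print("Radial velocities (m/s):", np.round(velocities, 1))

# A crude "tight rotation" flag: neighboring gates whose velocities differ
# by a large amount in opposite directions (threshold is illustrative).
diffs = np.diff(velocities)
print("Possible rotation signature:", bool(np.any(np.abs(diffs) > 25.0)))
```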
Our sense of vision relies on many different genes functioning properly. One of those genes is called RPE65. People who did not inherit a working copy of RPE65 suffer from Leber’s Congenital Amaurosis (LCA): a rare but severe type of vision loss which, tragically, strikes from early childhood.

Recently, an approach known as gene therapy has made very important progress. Gene therapy involves delivering working copies of a gene in a way that allows them to take over for the malfunctioning versions in the patient. Funded by the National Institutes of Health, researchers at the Children’s Hospital of Philadelphia created a gene therapy treatment for RPE65. Their method was then further developed into the commercially available treatment Luxturna. It is one of the first disease-treating gene therapies approved by the FDA, and it is greatly improving vision in people with LCA. Importantly, it is also helping to pave the way toward more gene therapies for other diseases, to improve the lives of more people.
Influenza, commonly known as the flu, is a contagious respiratory illness caused by influenza viruses that infect the nose, throat, and sometimes the lungs. It can cause mild to severe illness, and at times can lead to serious complications like pneumonia, or even death. There are four main types of influenza viruses: A, B, C, and D. Influenza A and B viruses are the types that cause the seasonal epidemics of disease in humans almost every winter. Influenza A viruses are further divided into subtypes based on two proteins on the surface of the virus (like H1N1 and H3N2), while influenza B viruses are broken down into specific lineages. These viruses are constantly changing, which is why new flu vaccines are needed each year. In the United States, the flu is a significant public health concern annually, with the Centers for Disease Control and Prevention (CDC) estimating it causes between 9 million and 45 million illnesses, between 140,000 and 710,000 hospitalizations, and between 12,000 and 52,000 deaths each year, depending on the season’s severity.

While flu pandemics had occurred before, scientists didn’t actually isolate the virus responsible for human flu until 1933. A British team, including Wilson Smith, Christopher Andrewes, and Patrick Laidlaw, working at the National Institute for Medical Research (NIMR) in London, was the first to identify the influenza A virus. Understanding the virus was the first step towards preventing it. The push for a vaccine gained urgency during World War II, as military leaders feared flu outbreaks could cripple the armed forces. The U.S. Army established a specific group, the Army Epidemiological Board’s Commission on Influenza, to tackle this problem. This commission funded and directed research led by Dr. Thomas Francis Jr. (who headed the commission) and his junior colleague Dr. Jonas Salk at the University of Michigan. They developed the first effective inactivated flu vaccine, using killed virus. This vaccine was first tested for safety and effectiveness on U.S. military personnel before being licensed for civilian use in 1945.

Today, flu vaccination is recommended annually for most people aged 6 months and older in the United States. Each year, tens of millions of Americans get their flu shot, although vaccination rates vary by age group and season, often hovering around 40-60% overall coverage. While getting vaccinated isn’t a guarantee against catching the flu, it’s the best way to prevent it and its potentially serious complications. The CDC estimates that flu vaccination prevents millions of illnesses and tens of thousands of hospitalizations each year. For example, during the 2019-2020 season alone, flu vaccination was estimated to have prevented about 7.5 million influenza illnesses, 3.7 million flu-associated medical visits, 105,000 hospitalizations, and 6,300 deaths in the U.S. These numbers highlight the significant public health benefit of widespread annual flu vaccination.
Microchips, technically called integrated circuits, are miniature marvels of engineering, typically made from silicon and containing millions or even billions of microscopic electronic components like transistors. These components work together to process data, perform calculations, and control functions, acting as the operational core for nearly all modern electronics. You find them everywhere: in smartphones, computers, televisions, cars, washing machines, sophisticated medical devices, and critical infrastructure. This ubiquity makes the semiconductor industry, responsible for designing and manufacturing microchips, a massive global market worth well over $500 billion annually, underpinning vast sectors of the world’s economy.

While Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor invented the first integrated circuits around 1958-1959, U.S. government funding and demand were crucial for turning this invention into a practical technology. Specifically, the U.S. Air Force required highly reliable, small electronics for its Minuteman missile guidance systems in the early 1960s, providing large, vital early orders. Simultaneously, NASA needed advanced, lightweight computers for the Apollo program, contracting MIT’s Instrumentation Laboratory to design the groundbreaking Apollo Guidance Computer, which heavily used early microchips. This military and space agency demand guaranteed a market, funded crucial manufacturing advancements, and drastically lowered costs, paving the way for commercial use.
Many Americans experience discomfort after consuming dairy because they have lactose intolerance, meaning they cannot properly digest lactose, the main sugar in milk. This happens because their small intestine doesn’t produce enough of an enzyme called lactase. Without sufficient lactase, the lactose travels undigested to the colon, leading to gastrointestinal issues. Addressing this common problem, USDA chemist Virginia Harris Holsinger pioneered a solution in the 1980s. She discovered that adding lactase enzyme sourced from non-human origins, such as fungi, directly into milk could effectively break down the problematic lactose. Holsinger’s method works because the added lactase enzyme pre-digests the milk sugar, breaking the lactose down into simpler, easily digestible sugars: glucose and galactose. The resulting milk tastes slightly sweeter than regular milk but provides a way for lactose-intolerant individuals to enjoy dairy without negative symptoms. Her research was successfully commercialized by the company Lactaid. They introduced lactose-free milk to the market and later expanded the concept to offer a variety of other popular dairy products, including ice cream, cottage cheese, sour cream, and even eggnog, suitable for those with lactose intolerance.
Solar panels work by capturing energy from sunlight. They contain many small cells, usually made of silicon, which act like tiny electricity generators. When sunlight (photons) strikes these cells, it knocks electrons loose, creating a flow of direct current (DC) electricity through the photovoltaic effect. Because most homes and businesses use alternating current (AC), an inverter is used to convert the DC electricity into usable AC power. Solar power is incredibly useful because it’s a clean energy source that doesn’t produce greenhouse gas emissions while operating, taps into the renewable power of the sun, and can help lower electricity costs. Following early private inventions, US public institutions significantly advanced solar technology. The Department of Energy (DOE) has funded extensive research, notably establishing the National Renewable Energy Laboratory (NREL) in 1977, which remains dedicated to improving solar efficiency and affordability. Many public universities across the country also conduct vital, often DOE-funded, research in materials and engineering. Key contributors include institutions like Arizona State University, known for its dedicated solar programs, the University of Delaware with its early Institute of Energy Conversion, and major research hubs within the University of California system. This sustained effort, combined with falling prices, has led to explosive growth. In recent years, solar power has consistently accounted for a massive portion, often around half, of all new electricity-generating capacity added to the U.S. grid each year.
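To get a rough feel for the numbers, here is a back-of-envelope Python sketch estimating one panel’s daily output from area, efficiency, sunlight, and inverter losses. Every value in it is an illustrative assumption, not a measurement.

```python
# Back-of-envelope estimate of a rooftop panel's daily output (illustrative
# numbers only; real output depends on location, weather, tilt, and hardware).
panel_area_m2 = 1.7          # typical residential panel (assumed)
efficiency = 0.20            # ~20% cell efficiency (assumed)
irradiance_w_per_m2 = 1000   # "full sun" reference irradiance
peak_sun_hours = 5           # assumed average for a sunny U.S. location

dc_watts = panel_area_m2 * efficiency * irradiance_w_per_m2
inverter_efficiency = 0.96   # DC-to-AC conversion loss (assumed)
ac_kwh_per_day = dc_watts * inverter_efficiency * peak_sun_hours / 1000

print(f"Peak DC power: {dc_watts:.0f} W")
print(f"Estimated AC energy: {ac_kwh_per_day:.1f} kWh/day")
```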
CRISPR-Cas9 gene editing is a groundbreaking technology that has had widespread impact on society, from changing the way we combat diseases to improving the foods that we eat. What was originally research into the genetic code of bacteria led to an interesting discovery. Scientists noted strange repeated sequences, which were named clustered regularly interspaced short palindromic repeats, or CRISPR for short. They also discovered that these sequences effectively acted like an immune system, allowing bacteria to “remember” past viral attacks to fight infections in the future. In 2012, biochemists Jennifer Doudna at UC Berkeley and Emmanuelle Charpentier showed how to harness CRISPR along with an enzyme known as Cas9 to cut DNA at exact locations. Ever since then, CRISPR-Cas9 gene editing technology has been adapted to cure diseases, improve crops, and aid in our quest to better understand genetics.
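As a toy illustration of what “cut DNA at exact locations” means in practice, the sketch below scans one strand of a DNA string for 20-letter target sequences followed by the “NGG” pattern that the commonly used Cas9 enzyme requires next to its cut site. It is only a teaching example; real guide-design tools check far more (the opposite strand, off-target matches, and so on), and the example sequence is made up.

```python
# Toy illustration: find candidate Cas9 target sites in a DNA string.
# SpCas9 cuts next to a 20-letter "protospacer" followed by an "NGG" PAM;
# this sketch scans only one strand and ignores everything a real
# guide-design tool would check (off-targets, GC content, etc.).
import re

def candidate_sites(dna):
    dna = dna.upper()
    sites = []
    # 20 bases followed by any base and then "GG" (the NGG PAM);
    # the lookahead lets overlapping candidates be found.
    for match in re.finditer(r"(?=([ACGT]{20})[ACGT]GG)", dna):
        sites.append((match.start(), match.group(1)))
    return sites

example = "ATGCGTACCGTTAGCCATGACCTGAAGTTCGGAGGCTAACGGTTACCGG"
for position, protospacer in candidate_sites(example):
    print(f"protospacer at {position}: {protospacer}")
```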
Am I pregnant? There have been methods to answer this question dating all the way back to the ancient Egyptians in 1350 BCE. Methods became more advanced over time, but they remained largely unreliable until there was a greater understanding of the hormonal and chemical changes that occur in the female body during pregnancy. In the 1960s and 70s the National Institutes of Health became one of the foremost places in the entire world to conduct such research, thanks to a combination of proper funding, expertise, and the ability to carry out what, in many settings, would be considered very tedious work due to the difficulty of extracting and synthesizing hormones.

Such a place attracted Judith Vaitukaitis and Glenn Braunstein, both of whom had been medical residents in Boston, and who began doing fundamental research to understand human chorionic gonadotropin (hCG), a hormone that was known to be linked to pregnancy and certain types of cancers. Vaitukaitis and Braunstein came to understand more about the structure of hCG and subsequently developed more advanced tools to detect it. These findings led to the development of the first at-home early pregnancy test, which gave women a safe and reliable way to determine pregnancy as early as 8-10 days after conception.
HIV (human immunodeficiency virus) infection, which untreated results in AIDS (acquired immunodeficiency syndrome), has killed over 40 million people worldwide since it was first identified in the early 1980s. While there is still no vaccine to prevent HIV infection, highly effective pre-exposure prophylaxis treatment, or PrEP, has become an important tool for limiting the spread of HIV since it was approved in the US in 2012 after clinical trials funded through the National Institute of Allergy and Infectious Diseases. Born out of research into antiretroviral drugs for treatment, these daily medications are taken by people who are at higher risk of getting HIV, and have been shown to drastically reduce the chance of infection if exposed to HIV. In the UK, for example, the number of new infections amongst gay men dropped for the first time ever in the year PrEP began being prescribed as part of the first trial in that country in 2017.

Access to these medications is a key part of the push to eliminate infections and deaths due to HIV across the United States and globally. While new infections worldwide peaked in 2004, HIV/AIDS remains one of the world’s most deadly infectious diseases, particularly across Sub-Saharan Africa. Given just how effective these medications have been, broad access to PrEP has the potential to stop this ongoing global epidemic.
Have you ever used a key-card to open the door to your office or hotel room? Have you used a tap payment to quickly pay for a coffee or groceries? Radio Frequency Identification (RFID) is the basis for being able to safely and securely use electronic locks and tap-to-use credit cards without worrying that somebody else’s card can also open your office or charge your card. It is in widespread use in offices and stores, and even in more modern home security systems. Developing RFID required ingenuity in figuring out ways to pattern individualized circuits that can be put in a card and that, when scanned by the “lock”, give a personalized identification code which the system can use to decide whether that key has access or not. Importantly, it lets these cards work without batteries, making them extremely useful in many settings.

RFID technology was made possible by research performed in a federal lab, specifically Los Alamos National Laboratory, and funded by federal agencies including the Atomic Energy Commission and the Animal and Plant Health Inspection Service of the U.S. Department of Agriculture.
Many of the technologies we have at our fingertips take advantage of huge computer systems that crunch massive amounts of numbers. Take the weather app on your phone: to predict the weather in your home state (and even specifically your city), the weather across the world needs to be recorded and fed into a computer model that can take those millions of numbers and figure out how they will impact whether or not you need to take an umbrella to work today. Similarly, the airplanes we use to travel, and the jets that comprise our air force, make use of computer models that figure out how air moving at high velocities will affect how bumpy your ride is. The ability to run these computer models is thanks to supercomputers: enormous Costco-sized buildings full of stacks of computers that work together to process all of that data and transmit the results out to the world – and eventually to your phone.

Federal funding has a long history of supporting the development of the supercomputer, from the DARPA-funded “ILLIAC IV” that was developed at the University of Illinois Urbana-Champaign starting in 1964, through the High Performance Computing Modernization Program (HPCMP) of the 1990s co-led by NSF and DARPA, up to the modern-day continuation of these programs involving NSF, DARPA, DOE, and multiple national labs. These efforts paved the way not just for the computers housed and used federally; much of the same technology runs the servers that private companies use to house your email, connect you to social media, and much more.
In the early 2000s, the NSF funded Alvin Roth, Tayfun Sonmez, and M. Utku Unver in applying economic theory to the problem of kidney donations. At the time, extensive waiting queues prevented tens of thousands of patients from obtaining the new kidneys they needed to survive. Roth and colleagues recognized a similarity between the problem of incompatible donors (when a patient’s immune system is incompatible with a donor’s) and existing theories described by Lloyd Shapley about efficiently matching pairs of people to each other based on their preferences.

This research led to the development of a large-scale kidney donation program, which allows individuals all over the nation to donate to one another in complex exchanges, while simultaneously retaining the incentives that drive people to donate to their relatives in need. Their framework specifically excludes the consideration of monetary incentives, which proved vital given U.S. laws that forbid the exchange of body parts for money. Their program has since been adopted to enable national kidney exchanges and continues to save lives by shortening kidney donation waiting times. For their work, Alvin Roth and Lloyd Shapley shared the 2012 Nobel Prize in Economics.
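To get a feel for the kind of matching theory involved, here is a stripped-down Python sketch of the “top trading cycles” idea that underlies such exchanges. The pair names and preference lists are made up, and it assumes every pair ranks every available donor; real exchange programs add many medical and logistical constraints.

```python
# Stripped-down sketch of the "top trading cycles" idea behind kidney exchange.
# Each incompatible patient-donor pair ranks the kidneys of the other pairs
# (and, last, its own); real exchange programs add many more constraints.

def top_trading_cycles(preferences):
    """preferences[pair] = list of pairs whose donor kidney it ranks, best first.
    Assumes every pair ranks every pair (including itself), so a cycle always exists."""
    remaining = set(preferences)
    assignment = {}
    while remaining:
        # Each remaining pair points at the best still-available donor.
        points_to = {p: next(d for d in preferences[p] if d in remaining)
                     for p in remaining}
        # Follow the pointers until a pair repeats: that's a trading cycle.
        path, current = [], next(iter(remaining))
        while current not in path:
            path.append(current)
            current = points_to[current]
        cycle = path[path.index(current):]
        # Every pair in the cycle receives the kidney it was pointing at.
        for p in cycle:
            assignment[p] = points_to[p]
            remaining.discard(p)
    return assignment

# Illustrative preferences for three incompatible pairs (purely made up).
prefs = {
    "pair_A": ["pair_B", "pair_C", "pair_A"],
    "pair_B": ["pair_C", "pair_A", "pair_B"],
    "pair_C": ["pair_A", "pair_B", "pair_C"],
}
print(top_trading_cycles(prefs))
# pair_A gets pair_B's kidney, pair_B gets pair_C's, pair_C gets pair_A's.
```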
In the 1930s and 40s, physicists at American universities, including Isidor Rabi at Columbia and teams at Harvard and Stanford, discovered and refined Nuclear Magnetic Resonance (NMR) through government-supported research. Initially used by chemists to analyze molecular structures, the technology’s potential for medical applications remained unexplored for decades. The breakthrough came in the 1970s when Paul Lauterbur, working at the federally funded Stony Brook University, discovered how to generate spatial images from NMR signals. This transformative federally supported research converted a physics lab technique into Magnetic Resonance Imaging (MRI), revolutionizing medical diagnostics without using radiation and earning Lauterbur the 2003 Nobel Prize in Medicine.

Today, MRI technology provides safe, radiation-free imaging that reveals details of soft tissues invisible to other technologies. With approximately 40 million MRI scans performed annually in the United States, the technology helps diagnose conditions from brain tumors and stroke damage to torn ligaments and cardiac problems. This federally funded basic science research has transformed into an essential medical tool that guides surgical planning, monitors disease progression, and evaluates treatment effectiveness, directly improving outcomes for millions of patients worldwide.
For modern robots, like self-driving cars, successfully navigating our complex world is a fundamental requirement. It’s not enough for them just to follow a pre-programmed path. They need to perform two critical tasks simultaneously: figuring out their precise location (“Where am I?”) while also building and updating a detailed map of their immediate environment (“What’s around me?”). This challenging process is called Simultaneous Localization And Mapping, or SLAM.Key developments enabling SLAM technology originated from research supported by agencies like NASA, initially envisioned for robots exploring unknown terrains autonomously. Today, SLAM forms the foundational software enabling many robots to continuously learn about their surroundings, accurately track their own position within that space, and intelligently plan paths towards their goals. Crucially, this allows them to safely maneuver around both stationary objects and unexpected obstacles that might suddenly appear, a capability essential for the early self-driving cars that won competitions like the DARPA Grand Challenge.
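As a toy illustration of the core idea – estimating the robot’s position and the map at the same time, so each new measurement improves both – here is a one-dimensional Python sketch using a simple Kalman filter and a single landmark. All the numbers are made up, and real SLAM systems are vastly more sophisticated than this.

```python
# Toy 1-D illustration of the core SLAM idea: estimate the robot's position and
# a landmark's position *jointly*, so each measurement improves both estimates.
import numpy as np

x = np.array([0.0, 5.0])          # state guess: [robot position, landmark position]
P = np.diag([0.01, 100.0])        # robot well known at start, landmark very uncertain

Q = np.diag([0.05, 0.0])          # motion noise (only the robot moves)
R = 0.1                           # range-sensor noise variance
H = np.array([[-1.0, 1.0]])       # measurement model: range = landmark - robot

true_robot, true_landmark = 0.0, 7.3
rng = np.random.default_rng(0)

for _ in range(10):
    # Predict: the robot commands a 1.0 m move (with some wheel slip).
    u = 1.0
    true_robot += u + rng.normal(0, 0.2)
    x[0] += u
    P = P + Q

    # Update: a noisy range measurement to the landmark.
    z = (true_landmark - true_robot) + rng.normal(0, R ** 0.5)
    y = z - (H @ x)[0]                       # innovation
    S = (H @ P @ H.T)[0, 0] + R              # innovation variance
    K = (P @ H.T)[:, 0] / S                  # Kalman gain
    x = x + K * y
    P = (np.eye(2) - np.outer(K, H[0])) @ P

print("Estimated robot position:    %.2f (true %.2f)" % (x[0], true_robot))
print("Estimated landmark position: %.2f (true %.2f)" % (x[1], true_landmark))
```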
Exposure to certain viruses can lead to an increased risk of some cancers, and it used to be thought that all or most cancers had a viral origin. But in the early 1990s, a research group at U.C. Berkeley, led by Professor Mary-Claire King, proposed the idea that mutations in particular genes could be an important risk factor for breast cancer. Using funding from the NIH, they were able to pursue this idea. The BRCA1 gene was discovered in 1994, and its counterpart BRCA2 followed a year later. This and subsequent research showed that people with mutations in these genes have an elevated risk of developing breast cancer, and that their cancers respond to treatments in different ways than other types of breast cancer.

This research directly led to novel screening tools for breast cancer risk assessment, early detection, and more targeted options for prevention and treatment – nothing short of a revolution in the standard of care for breast cancer. More generally, this research played a pivotal role in the advent of “precision medicine”, in which individual patients’ genetic profiles lead to personalized risk assessment and treatment for cancer and many other diseases.
Babies born prematurely are at risk and require a lot of care in order to grow, gain weight, and thrive. Work funded by the National Institutes of Health, starting in the 1970s, led to the discovery of a surprisingly effective treatment: the simple act of touch, through 15 minutes of massage therapy three times a day, can lead to a 47% increase in the baby’s weight gain.

The story starts with Prof. Saul Schanberg’s research group at Duke University. They knew that when rat pups are separated from their mothers, they grow less well than pups that aren’t separated. They wanted to find out what made the difference. The mother’s milk? The warmth of her body? Her comforting smell? They gradually ruled all of these out and came to a surprising conclusion: it was her touch. Stroking the pups with a gentle camel-hair brush, in a motion that mimicked how the mother licks them, allowed the pups to thrive.

Following up on this research, Prof. Tiffany Field’s group at the University of Miami asked whether similar touch therapy with humans would have a similar beneficial effect. They found that it did, and the positive effect was big. Fast forward years later, and massage therapy for preterm babies in Neonatal Intensive Care Units (NICUs) is now commonplace, helping many thousands of babies grow to be thriving children.

In addition to the emotional toll of having a baby in the NICU, care for babies in the NICU is expensive. The use of massage therapy to help babies thrive and successfully leave the NICU was estimated in 2014 to save about $10,000 per baby, for a total savings in the U.S. of about $4.7 billion every year.
Closed captioning is important in everyday life because it helps people who are deaf or hard of hearing enjoy TV, videos, and online content. It also helps people learn new languages, watch videos in noisy places, or follow along in quiet environments like libraries. As more videos are made for websites, social media, and online learning, the need for captions is growing fast. This creates more jobs for captioners and makes sure more people can enjoy and understand video content.

Publicly funded research played a key role in starting closed captioning. In the 1970s, government agencies like the National Bureau of Standards (now the National Institute of Standards and Technology) worked with TV stations to test early captioning systems, after being petitioned by a deaf woman to provide captions for emergency announcements. Later, the U.S. Department of Education helped fund the National Captioning Institute (NCI), which created the first captioning for live events, including news programs, sports, and the Academy Awards. NCI also captioned the first children’s program to carry closed captions, Sesame Street. Federal funding for closed captioning helped a new technology grow into a valuable service that supports jobs and helps millions of people every day.
Publicly funded research played a big role in creating Google. In the 1990s, the National Science Foundation (NSF) gave money to Stanford University for a project called the Stanford Digital Library Project. This research helped graduate students Larry Page and Sergey Brin study how to organize and search through huge amounts of information online. Brin was also awarded an NSF Graduate Research Fellowship during his graduate studies at Stanford. This fellowship supported his research in computer science, providing both financial assistance and recognition for his academic potential. Page and Brin’s graduate work on ranking web pages based on importance led to the creation of Google’s search engine.

Other government programs also helped support early research that made Google possible. Agencies like DARPA and the intelligence community funded projects on managing and searching big sets of data. This gave Page and Brin access to the tools and ideas they needed to build something new. With support from their professors and government grants, they turned their research into a powerful tool for finding information on the internet—changing how the world uses technology.
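The idea of “ranking web pages based on importance” can be sketched in a few lines: a page is important if important pages link to it. Below is a minimal power-iteration illustration of that idea on a made-up four-page web; it is a teaching toy, not Google’s production algorithm.

```python
# Minimal sketch of the "importance ranking" idea behind early web search:
# a page is important if important pages link to it. Toy graph only.
import numpy as np

# links[i] = list of pages that page i links to (a tiny made-up web).
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
n = len(links)
damping = 0.85

# Column-stochastic link matrix: M[j, i] = probability of hopping from i to j.
M = np.zeros((n, n))
for i, outgoing in links.items():
    for j in outgoing:
        M[j, i] = 1.0 / len(outgoing)

# Power iteration: repeatedly follow links (with occasional random jumps).
rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = (1 - damping) / n + damping * (M @ rank)

for page, score in sorted(enumerate(rank), key=lambda t: -t[1]):
    print(f"page {page}: {score:.3f}")
```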
About 1.45 million people in the United States have type 1 diabetes, a disease that renders the pancreas unable to regulate blood sugar. The pancreas normally regulates blood sugar using two hormones, one of which is insulin. Insulin helps take sugar out of the blood and into muscle or other tissues. People suffering from type 1 diabetes need to carefully monitor their blood sugar levels. If they see their blood sugar get too high, they should self-administer some insulin. But how much insulin? Getting the amount right can be very tricky. Too high a dose of insulin leads to hypoglycemia (which is a fancy name for too-low blood sugar); too low a dose, and the high blood sugar has not been brought down. Getting the dose wrong can lead to further health complications.

In order to better control blood sugar, the NIH and universities across the United States worked together to develop artificial pancreas systems. These devices essentially try to do the pancreas’ job for it. They break the problem down into three components: first, monitoring how much glucose is in the blood; second, using the level read from the monitor to calculate the correct dose of insulin that should be given; and third, administering the insulin. Each of these problems is solved by one device, and the combination of all three is known as an artificial pancreas. Glucose monitoring is done via a tiny sensor that is placed underneath the skin, usually via some adhesive pad. A program on a smartphone or some other device then gets information from the sensor and uses it to calculate the proper dosage of insulin necessary to keep blood sugar at a desired level. Lastly, insulin is delivered via a tiny pump that goes underneath the skin.

These systems were developed by a collaboration between the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) and universities in the Boston area, namely Harvard Medical School and Boston University. This effort included not only prototyping and development, but also extensive clinical trials to ensure safety for young children. These systems have made it easier for people to monitor their blood sugar throughout the night and, in particular, have made life much easier for young children and the elderly dealing with type 1 diabetes. It’s great for the doctors, too: they get much more detailed information about the status of their patients than they used to, which allows them to better tailor treatment to each individual patient.
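The second step – turning a glucose reading into an insulin dose – is where the control algorithm lives. The Python sketch below is a deliberately simplistic proportional controller meant only to illustrate the sense-compute-dose loop; it is not a medical algorithm, the target and sensitivity values are made up, and real artificial pancreas systems use clinically validated controllers with extensive safety logic.

```python
# Toy illustration of the sense -> compute -> dose loop (NOT a medical
# algorithm; real artificial-pancreas systems use clinically validated
# controllers and extensive safety checks).

TARGET_MG_DL = 120          # desired blood glucose (assumed target)
SENSITIVITY = 50            # assumed mg/dL drop per unit of insulin
MAX_DOSE_UNITS = 2.0        # hard safety cap for this toy example

def compute_dose(glucose_mg_dl):
    """Tiny proportional controller: dose in proportion to how far the
    reading is above target, never negative, never above the cap."""
    error = glucose_mg_dl - TARGET_MG_DL
    dose = max(0.0, error / SENSITIVITY)
    return min(dose, MAX_DOSE_UNITS)

# Simulated sensor readings arriving every few minutes.
for reading in [110, 145, 180, 240, 130]:
    print(f"glucose {reading} mg/dL -> suggested dose {compute_dose(reading):.2f} units")
```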
Asthma is a chronic respiratory disease involving inflammation and narrowing of the airways, causing symptoms like coughing, wheezing, and shortness of breath. It affects a vast number of people, with estimates suggesting over 260 million individuals globally suffered from it in recent years. Historically, treatment focused mainly on relieving airway spasms (bronchospasm), often with limited success and significant risks, before the focus shifted towards managing the underlying inflammation for better long-term control.

Progress in asthma treatment heavily relies on foundational science supported by public funding. Key NIH institutes like the National Institute of Allergy and Infectious Diseases (NIAID), the National Heart, Lung, and Blood Institute (NHLBI), and the National Institute of Environmental Health Sciences (NIEHS) conduct crucial research uncovering disease mechanisms and immune pathways. Additionally, NIH and the CDC support programs like the National Asthma Education and Prevention Program (NAEPP) and the National Asthma Control Program (NACP) to translate findings into guidelines and public health action. This deep understanding allows companies like Regeneron (New York) and Sanofi (France) to develop advanced therapies. A prime example is Dupixent, now benefiting over a million patients globally by specifically inhibiting the signaling of IL-4 and IL-13, key drivers of Type 2 inflammation in asthma.
Laser technology has allowed the development of computer hard drives, barcode scanning in grocery stores, satellite broadcasting, CDs, eye surgeries, cancer treatments, 3D printing, missile defense, law enforcement, digital video, astronomy, and much more. The physics behind lasers was first proposed and developed in the 1950s by Charles Townes, working at Columbia University with funding provided by the Office of Naval Research, the United States Army Signal Corps, and the National Science Foundation. After first developing the theory behind focusing light at non-visible wavelengths, he eventually refined his work with visible light, setting the stage for the invention of the modern laser. Lasers went on to prove immensely useful for a wide range of applications, and Townes eventually shared the 1964 Nobel Prize in Physics for his work.
The internet provides us with many conveniences, from the ability to shop online to connecting with friends and sharing personal and professional documents. At the heart of the internet is the ability for computers to talk to each other in a safe, secure way. Each computer must be able to take the information it wants to send (e.g., an email), package it into one or more messages that correctly arrive at their destination, and similarly be able to receive and interpret messages sent back. This ability to communicate was created using government funding from the Defense Advanced Research Projects Agency (DARPA). It consists of the Transmission Control Protocol (TCP), Internet Protocol (IP), and User Datagram Protocol (UDP). Packaged together as the better-known TCP/IP, these protocols are often considered the “backbone of the internet”.

One of the important parts of internet communication is the encoding of messages into discrete packets, for easy sending. TCP/IP began moving that job from the hardware, which could be different across computers from different manufacturers, to software, making the internet more universally available regardless of the device. The TCP/IP protocols also solved a number of major challenges in making the internet function, including ensuring ordered messages, helping to correct errors as messages get transferred to their destinations, and correctly identifying the computer that is supposed to receive the messages.

In 1983, ARPANET, one of the early precursors of the internet, began using TCP/IP. It’s been the backbone of the internet ever since.
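To see what “computers talking to each other” looks like from a programmer’s point of view, here is a minimal Python example in which two endpoints on the same machine exchange a message over TCP, with the operating system’s TCP/IP stack doing the packetizing, ordering, and error recovery. The address, port choice, and message are illustrative.

```python
# Minimal demonstration of two endpoints talking over TCP (the operating
# system's TCP/IP stack handles packetizing, ordering, and error recovery).
import socket
import threading

def server(sock):
    conn, addr = sock.accept()            # wait for a client to connect
    with conn:
        data = conn.recv(1024)            # receive up to 1024 bytes
        conn.sendall(b"echo: " + data)    # send a reply back

# Listen on an OS-assigned port on the local machine.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]
threading.Thread(target=server, args=(listener,), daemon=True).start()

# Client side: connect, send a message, read the reply.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello over TCP/IP")
    print(client.recv(1024).decode())     # -> "echo: hello over TCP/IP"

listener.close()
```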
Cancer encompasses a group of diseases characterized by abnormal cell growth with the potential to invade tissues and spread throughout the body, a process known as metastasis. It arises from changes or mutations in cellular DNA that disrupt normal cell function, leading to uncontrolled division. In the United States, cancer remains a significant health challenge, with estimates suggesting around 2 million new cases could be diagnosed in 2024, and it stands as a leading cause of death, responsible for over 600,000 fatalities annually in recent years. Traditional treatments like surgery aim to remove tumors, while radiation therapy uses high-energy beams and chemotherapy employs powerful drugs to kill cancer cells. Hormonal therapies are used for cancers sensitive to hormones. However, these conventional methods face limitations, including the difficulty of removing all cancerous cells surgically, potential damage to healthy tissues from radiation and chemotherapy, the development of drug resistance, and significant side effects that impact patients’ quality of life.

The development of immunotherapies represents a major advancement, harnessing the patient’s own immune system to target and destroy cancer cells. Drugs like Keytruda (pembrolizumab), an immune checkpoint inhibitor that helps T cells recognize and attack cancer cells by blocking the PD-1/PD-L1 pathway, exemplify this progress. As highlighted by the Cancer Research Institute, Keytruda recently achieved its 40th FDA approval, making it available for treating 18 distinct types of cancer. This success underscores the critical role of sustained research, significantly supported by federal funding through agencies like the National Institutes of Health (NIH) and its National Cancer Institute (NCI). This funding supports foundational research and clinical trials at institutions nationwide, laying the groundwork for breakthroughs that fuel the pipeline for new therapies. While precise future projections are complex, the advancements driven by immunotherapy and ongoing research into areas like cancer vaccines and precision medicine offer a positive outlook, contributing to a 34% decline in cancer mortality between 1991 and 2022 and holding the potential to extend and save many more lives in the coming decade.
Have you ever wondered whether your car could be hacked? Nowadays, cars have computers built into them, and some of those are connected to the internet. Doesn’t that make them vulnerable? In the mid 2000s, a team of computer scientists in Tadayoshi Kohno’s group at the University of Washington and in Stefan Savage’s group at the University of California, San Diego, began thinking about this problem. Supported by funding from the National Science Foundation, they started working methodically to try to break into a car electronically (well, two cars, actually, one at each university), to probe whether someone would be able to do so.

Modern-day cars have tens of computers built into them, built and programmed by many different companies, and assembled into a highly complex system. The team found that all the different manufacturers involved in producing the car’s computer systems had not yet worked together on protecting the cars from cyber-attacks. That left little gaps that the team was able to exploit. They learnt more and more, until they found that they were able to remotely take over the entire car. Turn its radio on or off, disable the brakes, use the car’s microphone to eavesdrop on conversations in it, drive the car, and on and on. Full control. Not just one car – they had developed the know-how to remotely have full control over millions of cars.

Sounds scary, doesn’t it? Before publishing their work, they quietly approached car manufacturers and told them about the problem. After all, the team’s goal was not to take over the nation’s fleet of cars. Their goal was to help improve car safety. Their work led to major new awareness and efforts across the industry to work on systematic cybersecurity, and cars today are far safer. No, your car can’t be remotely hacked through the internet – thanks to this National Science Foundation-funded effort.
Tens of thousands of people die each year from overdosing on opioid drugs. Since the 1970s, naloxone (more commonly known by its brand name, Narcan) has existed as an effective, rapid treatment to reverse opioid overdoses. However, overdoses would often prove fatal before trained medics could administer the life-saving drug. The National Institute on Drug Abuse partnered with the private sector to develop a nasal spray that could be pre-packaged and required minimal expertise to administer. Naloxone nasal spray is commercially available and continues to save lives across the nation.
In the late 1970s, diarrhea was causing dehydration and millions of deaths per year in children under five. Since then, the number of such deaths has been reduced by more than 80%, in large part thanks to the simple idea that drinking a combination of sugar, salt, and potassium mixed in water can save a young child from dying from dehydration.

When physician-scientists funded by the NIH, CDC, and USAID arrived in Pakistan in the 1960s during a cholera outbreak, the treatment of choice to reduce deaths from cholera-induced diarrhea was intravenous fluid delivery. However, access to intravenous intervention was very limited in practice. Patients unable to get to a hospital faced a grim 30 to 40 percent mortality rate. Many alternatives had been tried, including treating dehydrated patients with a variety of oral solutions, from carrot soup to coconut milk. But the best treatment remained unknown.

With this as a backdrop, U.S. scientists David Sachar and Norbert Hirschhorn, who were on assignment in Pakistan, began working to understand what was happening in the gut cells when treated with a combination of glucose and sodium. Using a method developed for measuring potentials across frog skin, they experimented with various possibilities, and were eventually able to show that a mixture of sugar, salt, and potassium, dissolved in water, allowed for enhanced salt absorption. Salt is an essential element for rehydration. From there, they began giving patients this mixture orally, and they found that it led to a very clear and substantial drop in diarrhea symptoms.

Since that first study, this simple, cheap treatment has saved millions of lives.
This example was suggested to our writers by Edward Witten of the Institute for Advanced Study.

One of the surprising ways pure mathematics has shaped modern technology is through something called topological quantum field theory (TQFT). Originally developed to study the properties of shapes and spaces, TQFT focuses not on precise measurements like lengths and angles, but on broader questions like: Does this object have a hole? How many holes? If you stretch or bend it without tearing, does it stay the same? These large-scale, or topological, features turn out to be incredibly stable — resistant to noise and small imperfections — and physicists realized that the same ideas could apply to certain exotic materials.

This realization led to the discovery of topological superconductors. These materials behave like normal superconductors deep inside, but along their edges or around defects, they host strange new types of quantum states. In particular, they can support Majorana fermions — exotic particles that are their own antiparticles. What makes Majorana fermions special is that they are not localized to a single point: they are spread out over a region, with two Majorana “halves” often living at opposite ends of a material. Because of this non-local nature, if you poke or disturb one end, you can’t easily change the overall state. The information they carry is tied to the topology — the “global” structure — rather than anything local. This makes them remarkably stable against the kinds of errors that plague ordinary quantum bits (qubits).

Today, scientists are racing to harness these properties to build topological quantum computers. In these systems, information would be stored not in fragile states that can be disrupted by every tiny bump or vibration, but in the “braids” formed by moving Majorana fermions around each other. Because these braids depend only on the order and structure of the paths taken — not on the exact timing or microscopic details — they offer a way to perform quantum computations that are naturally protected from errors. In fact, as of 2025, scientists and engineers have been able to design chips that can store and measure 8 qubits, with the ability to scale to a million qubits over time.

Quantum computers operating on million-qubit chips could solve many problems that classical computers cannot, which would completely revolutionize our daily lives. For example, they could simulate the behavior of complex molecules and quantum materials in ways that current supercomputers cannot — revolutionizing drug discovery, catalyst design, and materials science. They could also solve certain types of optimization problems exponentially faster, such as improving logistics and scheduling, or cracking encryption schemes based on number theory — a prospect that has national security implications.
Human beings are composed of trillions of cells, each of which is powered by small biological machines called proteins. Almost all diseases, disorders, and infections that humanity faces can ultimately be traced back to an issue with a specific type of protein, or to an organism (bacteria, parasites, etc.) that relies on the function of specific proteins. For this reason, essentially all drugs and treatments consist of small molecules or physical interventions that act by influencing particular proteins. It is important to understand the nature and shapes of proteins before we can identify candidate proteins and molecules for targeting with new drug research.

For more than half a century, a massive database called the Protein Data Bank (PDB) has existed thanks to federal funding from the National Science Foundation and the National Institutes of Health. Originally created at Brookhaven National Laboratory, this database currently contains about 230,000 entries that show the physical structure of various proteins and molecules, with around 10,000 new structures added each year. The database is open-access, and we encourage readers to “explore the proteins” that make our bodies work (such as the nanoscale motor we use to generate energy, complete with motor, axle, stator, and generator).

Although drug development often involves private-sector research, incentivized by earning returns on a product, this research relies on these basic protein structures to get started. Between 2010 and 2016, 88% of new FDA-approved molecules targeted proteins whose structures had been deposited in the PDB for free access, often a decade earlier, as a result of federally funded non-proprietary research. Indeed, every single new FDA-approved drug in this timespan relied to some extent on previous NIH-funded research. Recent analyses suggest that the value of the time spent using the PDB equates to about $5.5 billion each year, which is 800 times more than its operating cost. A conservative estimate of its impact on society due to publicly funded research alone is $1.1 billion a year.

This database has helped research on proteins and molecules involved in cancer and cancer immunotherapy, lipid storage diseases, diabetes, muscular dystrophy, epilepsy, cystic fibrosis, cardiovascular disease, antibiotics, living fuel cells, antitoxins, coronavirus, plastic decomposition, genetics, Alzheimer’s, depression, obesity, vision, oral health, and many more. Data from the PDB was also vital in powering recent AI advances in predicting protein structures, accelerating future drug design. You can see a full list of all the diseases with associated structures on PDB here.
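Because the PDB is open, anyone can pull a structure with a few lines of code. The sketch below downloads one entry from the RCSB public file server and counts its atom records; it assumes internet access, and the entry ID (4HHB, the classic hemoglobin structure) is just an example.

```python
# Small illustration of how open the PDB is: download one entry from the RCSB
# public file server and count its atom records. (Assumes internet access;
# 4HHB, a hemoglobin structure, is used here only as an example entry.)
import urllib.request

pdb_id = "4HHB"
url = f"https://files.rcsb.org/download/{pdb_id}.pdb"

with urllib.request.urlopen(url) as response:
    lines = response.read().decode("utf-8").splitlines()

atoms = [line for line in lines if line.startswith("ATOM")]
title = " ".join(line[10:].strip() for line in lines if line.startswith("TITLE"))

print(f"{pdb_id}: {title}")
print(f"Number of atom records: {len(atoms)}")
```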
Around 1 million people in the USA live with Parkinson’s Disease, with 90,000 new people diagnosed each year, usually past the age of 50. This disorder causes uncontrollable movement, associated with tremors, difficulty walking, and speech problems. The current gold-standard treatment, L-dopa, was first applied with significant benefits to patients at Brookhaven National Laboratory by George Cotzias in the 1960s. Although the drug had been known to provide therapeutic effects prior to this, the reduction in symptoms was mild and negative side effects prevented its widespread adoption. Cotzias was the first to show that high doses substantially reduced symptoms, and he avoided side effects by gradually ramping up the dose. He was the first to report its practical use in treating Parkinson’s at a government-run hospital.
What kind of diagrams in a textbook help kids learn the best? Maybe it’s diagrams full of rich texture and detail, that draw the kids’ attention to the diagrams. Or maybe it’s clean, simple diagrams that don’t distract the kids but let them get to the essential idea right away.

It turns out that the answer depends on the age of the kids looking at the diagram. Education researchers including David Menendez in the group of Prof. Martha Alibali at the University of Wisconsin Madison decided to test what kind of diagrams would best help kids learn that ladybugs go through metamorphosis (i.e., change their bodies, from being larvae to pupae to the cute bugs that fly in through our windows). They chose metamorphosis because it is a tricky concept that they could use with both younger (1st and 2nd graders) and older kids (4th and 5th graders).

The younger kids learnt best with richer, colorful diagrams. In contrast, the older kids learnt about ladybugs just as well with either richer or simpler black-and-white line diagrams – but they could generalize the idea of metamorphosis to other species better after the simpler line diagrams. In sum, the simple line diagrams were better for the older kids while the richer colorful diagrams were better for the younger kids.

Teaching well is full of nuance and depends on many things! With this federally-funded effort, the researchers are figuring out how to help kids at each age learn the best.
COVID-19, caused by the SARS-CoV-2 virus, is a respiratory illness that led to a global pandemic, resulting in widespread sickness, hospitalizations, and millions of deaths worldwide. The virus can also lead to long-term health problems known as “long COVID.” Getting vaccinated against COVID-19 became incredibly important because the vaccines were highly effective at preventing severe illness, reducing the need for hospitalization, and lowering the chances of death. Vaccination also played a crucial role in slowing the spread of the virus within communities and protecting vulnerable populations. The rapid development and deployment of these vaccines were critical tools in managing the pandemic and helping societies return to a sense of normalcy.

The groundbreaking science behind the COVID-19 vaccines heavily relied on years of federally funded research. For instance, the work of Katalin Karikó and Drew Weissman, who discovered how to modify mRNA to make it a viable and safe vaccine technology, was supported by government grants over many years. Similarly, Jason McLellan and his team’s crucial discovery of how to stabilize the spike protein of the coronavirus, a key component used in many COVID-19 vaccines, also benefited significantly from federal funding, including from the National Institutes of Health. These long-term public investments in fundamental scientific research laid the essential groundwork that allowed scientists to develop life-saving vaccines with unprecedented speed once the pandemic emerged.
Exoskeletons are wearable devices that assist people with mobility challenges, such as those recovering from strokes or spinal cord injuries, or individuals with cerebral palsy. These devices support the body and can help individuals walk, stand, and perform daily activities.

Publicly funded research has been instrumental in developing exoskeleton technology. For instance, the U.S. Department of Defense supported the early development of the Berkeley Lower Extremity Exoskeleton (BLEEX) through DARPA funding. This project, led by Professor Homayoon Kazerooni at the University of California, Berkeley, aimed to assist soldiers in carrying heavy loads. The success of BLEEX led Dr. Kazerooni’s team to the creation of medical exoskeletons like Ekso and Phoenix, which have helped individuals with paralysis to walk, stand, and speak face to face with peers in an upright position.

The National Institutes of Health (NIH) and the National Science Foundation (NSF) have also played key roles in funding exoskeleton research. Researchers from the NIH Clinical Center Rehabilitation Medicine Department created the first robotic exoskeleton specifically designed to treat crouch (or flexed-knee) gait in children with cerebral palsy. At Georgia Tech, researchers received an NIH grant to develop smart exoskeletons that adjust to each person’s walking style after a stroke. At Penn State, an NSF grant helped improve the comfort and fit of robotic walking devices. Additionally, at Northern Arizona University (NAU), Professor Zach Lerner has been awarded grants from both NIH and NSF to develop exoskeletons aimed at improving mobility for children with cerebral palsy. These projects are just some of many examples highlighting how federally funded research at U.S. universities has significantly advanced the development of exoskeletons for medical use, particularly in rehabilitation and mobility assistance.
In 1957, Dr. Paul Siegel started breeding chickens at Virginia Tech. His goal? Simple but bold: see what would happen if you selectively bred for body weight, generation after generation. He wasn’t looking for headlines—just following his curiosity, one clutch of chicks at a time.

Fast forward more than 65 years: Siegel’s chickens have become one of the longest-running animal research lines in the world. The results are astonishing. The chickens bred for high weight now grow to nearly twelve times the size of their leaner counterparts. That dramatic contrast has become a living, feathered archive—used by researchers across disciplines to study genetics, growth, metabolism, and immune response.

These chickens helped revolutionize poultry farming, too. Thanks to advances built on work like Siegel’s, today’s broiler chickens reach market weight in about six weeks—half the time it took in the 1950s. That means more efficient food production and less environmental impact. Not bad for a backyard bird.

But perhaps the coolest part? This wasn’t a flashy, million-dollar moonshot. It was methodical, government-funded research—supported by USDA and National Science Foundation grants, and driven by deep scientific curiosity. And it ended up benefiting not just agriculture, but medicine, genetics, and climate science, too.

In 2023, Siegel received the Golden Goose Award, which honors quirky, curiosity-driven research that ends up changing the world. Turns out, when you follow the science—even when it starts with chickens—you never know just how far it’ll fly.
You’ve seen them on TV. Maybe you or one of your loved ones has even had to use one: external defibrillators, electronic devices that can apply enough electricity to the chest to restart the heart. Their use is very dramatic (that’s why they’re on TV so much), and there is no guarantee of success. But they often work, and they save many lives. Defibrillators are a mainstay of modern-day EMT equipment; publicly accessible defibrillators, like the one in the picture, save thousands of lives each year, even when used by bystanders. The technology has gone through many iterations over the last few decades. But how did it all start? Who came up with the apparently crazy idea to begin with?

The first doctor to imagine that external electrical stimulation could perhaps restart a stopped heart was Dr. Paul Zoll at Harvard Medical School in the late 1940s. With funding from the National Institutes of Health, he carried out substantial research into making external defibrillation practical and advancing the technology. Dr. Zoll was the first person to successfully save someone’s life by applying an external shock (meaning, without having to perform surgery to access the heart). Many others took up and extended his work from there. We would not have the highly accessible, effective defibrillators that save so many lives today without Zoll’s pioneering work. TV shows wouldn’t be full of doctors shouting “clear!” And it was all made possible by federal funding.
In the early 1970s in Dallas, Drs. Joseph Goldstein and Michael Brown began a fruitful and decades-long collaboration that ended up transforming our understanding of heart disease. The story revolves around low-density lipoprotein (LDL), a particle of lipids and proteins that transports cholesterol around our bodies. Goldstein, Brown, and their team discovered that human cells rely on a specialized type of sensor, called an LDL receptor, to remove LDL from the blood. While cholesterol is an essential building block of many body processes, without properly functioning receptors, LDL accumulates, clogging arteries and increasing the risk of heart attacks. Working with cells from patients who lacked the gene to make LDL receptors, the team learned that these receptors are critical for removing excess cholesterol from the blood circulating through the body. This understanding was a stepping stone for the development of life-saving cholesterol-lowering drugs. Statins, for example, one widely used class of these drugs, work by lowering circulating LDL. Goldstein and Brown’s findings were so pivotal that they earned the Nobel Prize in Physiology or Medicine in 1985.

But how did the team know to care about cholesterol in the first place? In 1948, the National Heart Institute (now the National Heart, Lung, and Blood Institute, or NHLBI) launched the Framingham Heart Study in Massachusetts. This very long-term study followed thousands of participants over decades, allowing scientists to discern patterns that emerge only over many years. The Framingham Heart Study introduced the concept of “risk factors.” One of its key findings was that high blood cholesterol is a major risk factor for heart disease. This was the impetus that led Goldstein and Brown to focus on cholesterol. The Framingham Heart Study refined our understanding of “good” and “bad” cholesterol, and more broadly, laid the foundation for modern preventive cardiology.

Thanks to decades of work on the fundamental understanding of cholesterol metabolism, knowledge that began in Framingham, Massachusetts found its way to a lab in Dallas, Texas and became a cornerstone of cardiovascular medicine—drastically reducing deaths from coronary artery disease, the world’s leading killer.
Did you know that a doctor once helped make our homes and playgrounds much safer—and it all started with baby teeth? Dr. Herbert Needleman, a child psychiatrist and researcher, made a shocking discovery: even small amounts of lead could hurt kids’ brains and lower their IQs. But he didn’t do it alone. The U.S. government, through the National Institutes of Health (NIH), gave him the money and support he needed to run his studies. This federal funding helped Dr. Needleman collect thousands of teeth from schoolchildren and test them for lead. Then, he compared the lead levels to the kids’ IQ scores—and the results were clear. The more lead in their teeth, the lower their scores.

Before Dr. Needleman’s research, people thought kids had to be seriously poisoned by lead before it caused any problems. But his studies, made possible by NIH grants, showed that even tiny amounts—too small to make kids obviously sick—were still damaging their brains. The federal support allowed him to do long-term studies, use advanced lab tests, and work with experts across the country. These tools helped prove something many people didn’t want to believe: low-level lead exposure was silently stealing children’s potential. Because of this work, the public finally understood how dangerous lead in things like paint and gasoline really was.

Thanks to this research, huge changes were made. The government took lead out of gas, banned lead paint in homes, and made neighborhoods safer for kids. As a result, children’s lead levels dropped dramatically, and average IQ scores in the U.S. actually went up. Dr. Needleman’s discovery, powered by NIH support, not only changed science—it helped protect millions of kids from harm. It’s a powerful example of how funding scientific research can lead to major improvements in health and safety for everyone.
Some innovative technological advances have come from private companies rather than directly from federally funded research, but did you know that many of these companies actually got their start thanks to federal funding? Every government agency with a budget for R&D participates in a program called the Small Business Innovation Research (SBIR) program, which provides targeted grants to jumpstart entrepreneurs with big ideas. In the 1980s, the company Qualcomm was a small business with a burgeoning interest in satellite communications. SBIR grants provided its first funding, enabling it to develop the technology that would transform the world of wireless communications, eventually giving us standards like 4G and 5G. Qualcomm has also provided vital communication technology for transportation services like trucking. As of 2023, Qualcomm is ranked 3rd in the world for number of patents. Thanks to the government’s initial investment, Qualcomm now employs ~50,000 people and has a revenue of ~40 billion dollars, of which two-thirds comes from the department that received the initial SBIR grants.
Nearly anyone can get access to a 3D printer, opening up a world of invention and innovation. However, before the invention of 3D printing, building complex or irregularly shaped parts was extremely time consuming and required carefully honed skills.

In order to 3D print something, there needs to be a design for the printer to follow. In the ‘70s, the U.S. National Science Foundation funded the creation of 3D modeling computer programs, which would eventually develop into Computer Aided Design (CAD), the software that 3D printing technologies still rely on today.

In the ‘80s, the U.S. National Science Foundation launched the Strategic Manufacturing Initiative (STRATMAN) to continue supporting the fledgling field of 3D printing. The earliest technology to come out of this work used liquid polymers that hardened into solid plastic when exposed to light.

But this wasn’t the end of the story. How could one print with metals or with polymers other than plastic? With funding from a STRATMAN grant, this problem was solved with Selective Laser Sintering, in which lasers melt thin layers of powder (metal or polymer) to build up the desired shape. Thanks to the U.S. National Science Foundation, there are now a variety of 3D printing methods that can use diverse materials and be used for nearly any design.
How was Apple’s AI assistant Siri born? In the early 2000s, seeking more efficient decision-making and a reduced need for large command structures, the Department of Defense began funding a project on this problem. The researchers reasoned that a virtual assistant that could translate as well as assist in personalized decision-making could make command structures more lightweight. Much of the funding went into project CALO (Cognitive Assistant that Learns and Organizes). CALO eventually grew beyond its Department of Defense origins and became a massive collaboration led by SRI International (formerly the Stanford Research Institute). It included researchers from Carnegie Mellon University, the University of Massachusetts, and the Institute for Human and Machine Cognition, among others.

Together, these researchers created a state-of-the-art AI system. An offshoot of this AI was later commercialized as Siri Inc., which was purchased by Apple and became the virtual assistant that many now use in their day-to-day lives.
Today there are millions of video games in existence. But how did we get here? Before there was Fortnite, Minecraft, and Call of Duty, there was Tennis for Two. Invented by William Higinbotham at Brookhaven National Laboratory in 1958, it was a simple game that used an oscilloscope to display a side view of a tennis court. Players used the buttons and rotating dials of the device to control an invisible tennis racket. Despite running on an analog computer, the ball, a small moving dot, traveled in a surprisingly realistic path, taking into account wind resistance as well as gravity.

Higinbotham created this game not for profit, but to spread scientific passion. He recognized that most science exhibits were static, and sought to create a more interactive station for the lab’s annual exhibition. The game quickly grew popular, with hundreds of visitors waiting in line for a chance to play. It eventually inspired the creation of Pong in 1972, leading to the modern era of video games with multiplayer, hyperrealistic, and even virtual-reality options.
“Your child has brain cancer.”

Thousands of parents hear those words every year. For some, the diagnosis is even more devastating: Diffuse Intrinsic Pontine Glioma (DIPG), a particularly aggressive and inoperable brain tumor that strikes exclusively in children. DIPG attacks the brainstem, the control center for very basic functions like breathing and the child’s heartbeat. Removing the tumor is difficult and often not an option. Radiation can slow the tumor’s growth, but there is no real treatment. No cure. The survival rate is near zero.

But federally funded researchers at Stanford University are rewriting the future of DIPG.

A new way to fight DIPG. Dr. Michelle Monje, a federally funded researcher at Stanford, has spent years studying how DIPG tumors grow. A DIPG tumor does not form a neat mass. Instead, it blends into the healthy brainstem tissue, entwining itself with the very cells that help a child breathe, swallow, and live. This close integration was thought to make DIPG untreatable. However, one of the most important discoveries of Monje’s team was that the tumor cells weren’t just blending in: they were communicating with early-born non-neuronal cells in the brain.

This discovery opened up a daring new line of attack. If this communication is important for the cancer’s growth, then interrupting the cellular “conversation” might make it possible to treat the cancer.

The CAR T-cell therapy approach, and a stunning result. Monje’s team turned to CAR T-cell therapy, a cutting-edge treatment that trains a patient’s own immune cells to recognize and destroy cancer cells. It had never been used for DIPG. No one knew if it would work. They hoped it would slow down the cancer. But what happened next exceeded every expectation.

One patient was completely cured. Several others had tumors that shrank significantly in size. It is important to note that not all patients responded to CAR T-cell therapy; some saw no measurable improvement, highlighting both the promise of the therapy and the urgent need for continued work.

This was the first time such dramatic responses were seen in DIPG patients. The results mark a turning point in the decades-long search for a treatment. Some patients finally have hope. Rapid progress toward cures for childhood brain cancers like DIPG relies heavily on federal funding from the National Institutes of Health. Government grants helped Dr. Monje and others pursue high-risk, high-reward research that wouldn’t happen otherwise. Continued investment is essential for finding treatments for rare childhood brain cancers like DIPG.
In the fall of 2024, a baby in Philadelphia was born with a rare genetic disorder that usually leads to severe mental and developmental delays, and in most cases death. The cause: a mutation in a gene, CPS1, that helps the liver break down the ammonia that naturally occurs in the body. Without the ability to break down ammonia, most children with this mutation die within the first week. But doctors and scientists gave this baby, and hopefully many others to follow, a chance at life when they took on the challenge of developing a custom therapy to edit and fix the faulty CPS1 gene.

To do this, scientists developed a technique using an enzyme called CRISPR-Cas9. In 2012, Jennifer Doudna at U.C. Berkeley and Emmanuelle Charpentier, who was then at Umeå University in Sweden, discovered that CRISPR-Cas9 can be targeted to a specific, chosen location in the genome and used to edit that location. For this baby, scientists targeted the location in the genome where CPS1 is found, with the goal of correcting the mutation in the baby’s gene. The scientists wrapped the CRISPR-Cas9 enzyme in lipid molecules to protect it from degradation in the bloodstream. In other words, it was a tiny, slippery ball of protective fats surrounding a gene-editing machine made specifically for this baby. The idea was that once this technologically advanced little bundle reached the liver, CRISPR-Cas9 would be able to find and recognize the mutation in CPS1, and then fix it.

With the baby kept alive in the hospital through a very strictly controlled food and medicine regimen and constant intensive care monitoring, the doctors raced against time. Astonishingly, they were able to use CRISPR-Cas9 to go from a diagnosis to a completely new custom treatment in only six months, remarkable speed for such a development. As soon as the doctors could, they gave the baby the first of what would be three separate infusions of the lipid-wrapped microballs of CRISPR-Cas9. By nine months, the treatment was working and the baby had near-normal ammonia levels. The baby was cleared to go home with his parents, and the hospital was able to discharge him.

A baby’s life, saved. A family, made whole.

More than 30 million people in the United States have a similarly rare genetic disease. It was thought that most of these diseases would never have a gene-therapy treatment, due to the high cost of developing each treatment and the low number of patients per disease. However, thanks to decades of federal funding and a group of doctors willing to try, the scientists forged a new path for companies to develop personalized treatments for a fraction of the previous cost. They saved one baby’s life, and opened the door to saving and helping many, many more.
Chances are, you or someone you know may currently have a streak on Duolingo. But did you know that the world’s most popular language-learning app got its start thanks to federal funding? It all began at Carnegie Mellon University, where computer science professor Luis von Ahn and his graduate student Severin Hacker wanted to work at the intersection of language education and improving automated translation of web content. Originally, Duolingo was designed so that while users were learning languages, they were also helping to translate real web content, like Wikipedia articles and news stories. This clever idea meant people were learning and helping make the internet more accessible at the same time. Federal funding from the National Science Foundation (NSF) gave the team the time and resources to build and test this unique approach, combining education and crowdsourced translation in a way that had never been done before.

Today, Duolingo has millions of users learning everything from Spanish to Swahili for free, and it also offers an affordable English proficiency certificate. The app has evolved beyond translations, now using artificial intelligence and gamified lessons to help people all over the world reach their language goals. It’s a perfect example of how federal funding in science and education can lead to powerful, world-changing tools.
Imagine curing type 1 diabetes—not with daily shots, but by replacing the patient’s cells that should make insulin with cells that actually do. That’s exactly what scientists are doing with a breakthrough treatment that puts lab-grown insulin-producing cells into the body. The researchers are using stem cells, which, unlike most cells, have not yet matured into the final fixed type of cell they will become (like hair, or skin, or muscle). Instead, stem cells can in principle still turn into any type of cell. In this case, the scientists are coaxing stem cells into becoming insulin-producing cells, like those found in the pancreas. In early trials with these new stem cells, people started making their own insulin again—one participant even ditched insulin shots completely. Blood sugar levels improved fast and stayed steady. It’s real progress, and it’s happening now.

There’s still a catch: patients need medications to stop their immune systems from attacking the newly transplanted cells. But researchers are already working on ways to fix that too—like building tiny shields around the cells. A key step will be to create a more natural environment for the stem cells, which would improve their integration and longevity, offering hope for more effective and lasting treatments for type 1 diabetes. More work is needed, but this breakthrough shows how bold ideas and smart science are changing lives—and bringing us closer to a cure.
Ironing, which used to be a weekly and time-consuming chore, has been on a sharp decline worldwide, in large part due to wrinkle-free cotton. (The trend predates the rising popularity of athleisure.) “Wrinkle-free cotton”… it sounds great, as if it just magically avoids wrinkles, doesn’t it? But how did the world get wrinkle-free cotton? The story traces its beginnings to the 1950s and a federally funded research center in Louisiana. There, United States Department of Agriculture chemist Ruth Benerito led a team that figured out how to wrangle the cotton fibers in cloth so that they resist wrinkles after getting wet.

Benerito was one of just two women who were allowed to take physical chemistry classes at Tulane University. Those classes led to her earning her doctorate from the University of Chicago in 1948 and eventually to the creation of wrinkle-resistant cotton.

The process of creating “wash and wear” fabrics is like what happens when chemical relaxers are used to make curly hair straight. Cotton is a polymer—a very long molecule that can be thought of like a chain—and its strands tend to kink up after getting wet, leading to wrinkled shirts, sheets, pillowcases, and more.

In the laboratories at the United States Department of Agriculture’s Southern Regional Research Center, Benerito discovered that washing the cotton fibers with a specific type of acid that includes one chlorine atom strengthened the connections between the polymer’s strands. That extra strength helps them resist being bent into kinks. Chemical relaxers for hair only last until new hair grows in, but the acid wash that Benerito discovered transformed cotton clothing and household linens from being wrinkle prone to being permanently wrinkle resistant. Of course that made cotton a much more attractive fabric for consumers than it would be otherwise. Not only did this federally funded discovery save millions from the tedium of standing at an ironing board for hours each week, but it also saved the American cotton industry!
Kidneys keep people alive by removing waste products and controlling the levels of important minerals in our blood. Before 1960, kidney failure was essentially fatal. Acute dialysis, the process of artificially doing the kidney’s job for it, could only keep patients alive temporarily. In 1960, Belding Scribner, Wayne Quinton, and David Dillard developed a new way to access the blood, the Scribner shunt, that enabled chronic dialysis over the course of years. Chronic dialysis could now keep patients alive for years or even decades after kidney failure. Scribner sought to get his technology to as many patients in need as possible, and together with James Haviland he created the world’s first outpatient dialysis clinic in 1962, now known as the Northwest Kidney Centers.

Providing patients with chronic dialysis is very expensive, and Scribner couldn’t obtain the necessary funds from philanthropic organizations. Fortunately, the federal government saw the potential of chronic dialysis and stepped in to support the clinic until it found additional funding. Because treatment was still expensive and limited, a committee was also formed to decide, on an anonymous basis, which patients would be admitted; it served as one of the first bioethics committees. Scribner continued to fight for more accessible treatment nationwide, and helped enact legislation that would provide Medicare support for dialysis.

Given Scribner’s success, the government proceeded to expand dialysis availability around the nation. Under Scribner’s direction, the Department of Veterans Affairs quickly established a large-scale dialysis program in 30 units across its system of hospitals. The Public Health Service provided similar funding to the Downstate Medical Center in Brooklyn. With Scribner’s advocacy, two research programs were established across multiple NIH institutes, which would fund the majority of early clinical research on dialysis.

Today, hundreds of thousands of people with kidney failure are treated with chronic dialysis, extending their lives while they wait for a kidney transplant.