Health and Well-Being · National Security · Prosperity
Federally funded research gave us...
treatments for melanoma
This site crowd-sources societal benefits
stemming from government-funded research in the United States.
We have
28 examples and counting.

To help astronauts survive long-term missions, NASA started thinking about ways to grow food in space. This required reducing the amount of ethylene in the air. Ethylene is a gas released by plants, and its build-up in a closed environment (like the space station) leads to wilting and accelerated plant decay. NASA funded research at the University of Wisconsin–Madison to work on this problem. It turns out that the system the researchers developed to remove ethylene from the air can also remove other airborne organic compounds, making it the basis for a reportedly high-quality air purifier. The technology was licensed by a private company and is now used to build home and commercial air purifiers.
If you heard that the National Institutes of Health was funding research on lizards, you might at first think that the N.I.H. was wasting taxpayer money. But in fact, a researcher at the Veterans Affairs Medical Center in the Bronx discovered that the venom of Gila monsters (lizards native to the southwestern U.S.) is an excellent promoter of insulin, the hormone that our bodies use to regulate blood sugar levels. Further work by a researcher at the National Institute on Aging helped take that discovery to the clinic and to the production of a drug now widely used to combat type 2 diabetes. Thanks to government-funded scientists and their research, Gila monsters are saving lives.
Cancer is a devastating disease. Each year in the United States, approximately two million people are diagnosed with cancer, and half a million die from cancer-related causes. Of these cancers, late-stage melanoma is one of the deadliest. However, basic research into immunology in the 1990s, funded by the National Institutes of Health and carried out by James Allison (then at UC Berkeley, now at the M.D. Anderson Cancer Center in Houston, TX), led to the development of the drug ipilimumab.

Our immune system normally protects us from invaders (like bacteria and viruses) by attacking them. Cancer is also an invader, but many cancers manage to block the immune system’s attack. Allison discovered how to prevent that blockage, thus unleashing the immune system on cancerous tumor cells. This is the basis for how ipilimumab works. Ipilimumab has substantially improved life expectancy and lowered death rates for individuals with a poor melanoma prognosis. Astonishingly, a subset of individuals even achieved complete elimination of their cancer, something previously thought impossible in late-stage melanoma.

For his impact on immunology and the treatment of melanoma, James Allison (a “blues-loving scientist from the small town of Alice, Texas”) was awarded the 2018 Nobel Prize in Physiology or Medicine. With this research, the N.I.H. helped launch a new era of cancer immunotherapy and the ongoing development of many new drugs that seek to harness our immune systems to attack malignant tumors.
The Internet is fundamental to many aspects of our lives: communications, commerce, entertainment, and more. It began as the ARPANET, a project funded by the Advanced Research Projects Agency of the Department of Defense, and as NSFNET, funded by the National Science Foundation. This government-funded work, built initially to connect researchers to each other, now connects the entire world and powers our economy.
Before touchscreens became the standard way to interact with our phones, devices, and electronics, they were the Ph.D. thesis project of a graduate student at the University of Delaware, funded by a fellowship from the National Science Foundation. Wayne Westerman’s dissertation developed the multi-touch capability that was necessary for touchscreens to take off as a commercial product. The company that he and his Ph.D. adviser started was purchased by Apple, its technology was incorporated into the iPhone, and the rest is history.
Postpartum depression (PPD) affects 1 in 8 women and is a leading cause of maternal mortality following childbirth. Until recently there was no treatment specifically for PPD, with standard antidepressants proving ineffective and too slow. Using funding from the NIH, basic research into a hormone called allopregnanolone revealed its role in regulating specific types of neurons in the brain associated with PPD. This research eventually led to brexanolone, the first ever FDA-approved drug for treating PPD. This drug is not only highly effective in comparison to standard antidepressants, but importantly takes effect within hours instead of the multiple weeks associated with standard treatments. Thanks to basic research supported by the NIH, we now have an effective treatment for PPD.
Today, GPS is ingrained in our lives, from figuring out the best route home after a road closure to making sure a loved one reached their destination safely. The modern system, which now consists of 31 satellites orbiting the earth, five ground control stations located around the globe, and the many receivers found in our phones, cars, and other devices, began as a project under the Department of Defense in 1973. The work was carried out across several federally funded institutions: the US Naval Research Laboratory, The Johns Hopkins University Applied Physics Laboratory, and The Aerospace Corporation. The system became fully operational in 1993, and in 2000 its full capabilities were made freely accessible across the globe for peaceful purposes.
Anesthesia, while essential for surgeries, requires careful monitoring. You want the patient to be anesthetized enough that they won’t feel or remember anything. But you need to be careful not to anesthetize them so deeply that the anesthesia itself might cause later complications. For example, overuse of some important anesthetics such as propofol can sink a patient into a state known as “burst suppression”. If a patient falls into this state during surgery, then after they wake up, they may experience confusion, delirium and memory loss – sometimes for months. We want to avoid that.

So, how much is too much propofol? How can the anesthesiologist tell when the patient is getting close to the burst suppression state, so that they can back off a bit? In research funded by the National Institutes of Health, a team at the Massachusetts Institute of Technology (MIT) used a mathematical tool known as state-space modeling to analyze the EEGs of patients undergoing anesthesia. They discovered that the strength of fluctuations in the amplitude of a particular type of brain wave is a very good measure of how close a patient is to the burst suppression state. This means those fluctuations can be used to tell when to back off on the propofol.

While this result is too new to have made it into the clinic quite yet, it is very promising: using the new measure the team discovered, anesthesiologists will be able to better protect their patients.
When a tornado begins to form, high-speed winds several kilometers above the ground begin to move in a tight rotation known as a “Tornado Vortex Signature” (TVS). Detecting this raises a warning about a possible tornado well before the tornado touches down on the ground. But how can we tell this distant TVS is happening? Tornadoes often form in thunderstorms, and therefore carry droplets of rain with them. Which leads us to the secret sauce: radar signals bounce off rain droplets, and how they bounce is affected by the motion of the droplets. That can be leveraged to measure the speed of the droplets with radar, and so detect the TVS.

How is that leveraging done? In 1973, researchers in Norman, Oklahoma, working for the United States Weather Bureau (which later became part of NOAA), figured out how to do it using a principle known as the Doppler effect. You know how a siren on an ambulance sounds high-pitched while the ambulance is driving towards you, but sounds lower-pitched once it has passed you and is driving away? The very same applies to radar and rain droplets: the radar tower sends out a signal at some frequency. If it bounces off a droplet that is moving away, the signal that comes back to the tower will have a lower frequency; if the droplet is moving towards the tower, the returning signal will have a higher frequency. How much higher or lower depends on the droplet’s speed. Using this, radar can measure storm wind speed and direction. When two nearby locations are measured to have very different wind speeds, or even opposing wind directions, that suggests the tight rotation of a TVS.

This principle forms the basis of modern tornado early-warning systems, and helps to keep millions of people safe.
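The size of the frequency shift follows the standard round-trip radar Doppler formula, delta_f = 2 · v · f0 / c. Here is a minimal sketch in Python, with illustrative numbers (the ~2.8 GHz carrier frequency is typical of modern S-band weather radars, not a figure taken from the 1973 work):

```python
# Round-trip Doppler shift for a radar signal bouncing off a moving droplet:
#   delta_f = 2 * v * f0 / c
# The factor of 2 appears because the signal travels out and back.

C = 299_792_458.0  # speed of light, m/s

def doppler_shift(radial_speed_m_s: float, radar_freq_hz: float) -> float:
    """Frequency shift of the returned signal; positive when the droplet
    moves toward the radar tower, negative when it moves away."""
    return 2.0 * radial_speed_m_s * radar_freq_hz / C

# A droplet carried toward the tower at 50 m/s, seen by a ~2.8 GHz radar:
shift_toward = doppler_shift(50.0, 2.8e9)    # ≈ +934 Hz
# The same droplet moving away gives the opposite sign:
shift_away = doppler_shift(-50.0, 2.8e9)     # ≈ -934 Hz
```

Note how small the shift is: about 934 Hz on a 2.8 GHz carrier, roughly 0.3 parts per million, which is why measuring it precisely was the hard part.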
Our sense of vision relies on many different genes functioning properly. One of those genes is called RPE65. People who did not inherit a working copy of RPE65 suffer from Leber’s Congenital Amaurosis (LCA): a rare but severe type of vision loss which, tragically, strikes from early childhood.

Recently, an approach known as gene therapy has made very important progress. Gene therapy involves injecting working copies of genes in a way that allows the injected copies to take over for malfunctioning versions in the patient. Funded by the National Institutes of Health, researchers at the Children’s Hospital of Philadelphia created a gene therapy treatment for RPE65. Their method was then further developed into the commercially available treatment Luxturna, one of the first gene therapies approved by the FDA to treat a disease. This treatment greatly improves vision in people with LCA and, importantly, it is helping to pave the way for more gene therapies, to improve the lives of more people.
Influenza, commonly known as the flu, is a contagious respiratory illness caused by influenza viruses that infect the nose, throat, and sometimes the lungs. It can cause mild to severe illness, and at times can lead to serious complications like pneumonia, or even death. There are four main types of influenza viruses: A, B, C, and D. Influenza A and B viruses are the types that cause the seasonal epidemics of disease in humans almost every winter. Influenza A viruses are further divided into subtypes based on two proteins on the surface of the virus (like H1N1 and H3N2), while influenza B viruses are broken down into specific lineages. These viruses are constantly changing, which is why new flu vaccines are needed each year. In the United States, the flu is a significant public health concern annually, with the Centers for Disease Control and Prevention (CDC) estimating that it causes between 9 million and 45 million illnesses, between 140,000 and 710,000 hospitalizations, and between 12,000 and 52,000 deaths each year, depending on the season’s severity.

While flu pandemics had occurred before, scientists didn’t actually isolate the virus responsible for human flu until 1933. A British team including Wilson Smith, Christopher Andrewes, and Patrick Laidlaw, working at the National Institute for Medical Research (NIMR) in London, was the first to identify the influenza A virus. Understanding the virus was the first step towards preventing it. The push for a vaccine gained urgency during World War II, as military leaders feared flu outbreaks could cripple the armed forces. The U.S. Army established a specific group, the Army Epidemiological Board’s Commission on Influenza, to tackle this problem. This commission funded and directed research led by Dr. Thomas Francis Jr. (who headed the commission) and his junior colleague Dr. Jonas Salk at the University of Michigan. They developed the first effective inactivated flu vaccine, using killed virus.
This vaccine was first tested for safety and effectiveness on U.S. military personnel before being licensed for civilian use in 1945. Today, flu vaccination is recommended annually for most people aged 6 months and older in the United States. Each year, tens of millions of Americans get their flu shot, although vaccination rates vary by age group and season, often hovering around 40–60% overall coverage. While getting vaccinated isn’t a guarantee against catching the flu, it’s the best way to prevent it and its potentially serious complications. The CDC estimates that flu vaccination prevents millions of illnesses and tens of thousands of hospitalizations each year. For example, during the 2019–2020 season alone, flu vaccination was estimated to have prevented about 7.5 million influenza illnesses, 3.7 million flu-associated medical visits, 105,000 hospitalizations, and 6,300 deaths in the U.S. These numbers highlight the significant public health benefit of widespread annual flu vaccination.
Microchips, technically called integrated circuits, are miniature marvels of engineering, typically made from silicon and containing millions or even billions of microscopic electronic components like transistors. These components work together to process data, perform calculations, and control functions, acting as the operational core for nearly all modern electronics. You find them everywhere: in smartphones, computers, televisions, cars, washing machines, sophisticated medical devices, and critical infrastructure. This ubiquity makes the semiconductor industry, responsible for designing and manufacturing microchips, a massive global market worth well over $500 billion annually, underpinning vast sectors of the world’s economy.

While Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor invented the first integrated circuits around 1958–1959, U.S. government funding and demand were crucial for turning this invention into a practical technology. Specifically, the U.S. Air Force required highly reliable, small electronics for its Minuteman missile guidance systems in the early 1960s, providing large, vital early orders. Simultaneously, NASA needed advanced, lightweight computers for the Apollo program, contracting MIT’s Instrumentation Laboratory to design the groundbreaking Apollo Guidance Computer, which heavily used early microchips. This military and space agency demand guaranteed a market, funded crucial manufacturing advancements, and drastically lowered costs, paving the way for commercial use.
Many Americans experience discomfort after consuming dairy because they have lactose intolerance, meaning they cannot properly digest lactose, the main sugar in milk. This happens because their small intestine doesn’t produce enough of an enzyme called lactase. Without sufficient lactase, the lactose travels undigested to the colon, leading to gastrointestinal issues. Addressing this common problem, USDA chemist Virginia Harris Holsinger pioneered a solution in the 1980s. She discovered that adding lactase enzyme sourced from non-human origins, such as fungi, directly into milk could effectively break down the problematic lactose. Holsinger’s method works because the added lactase enzyme pre-digests the milk sugar, breaking the lactose down into simpler, easily digestible sugars: glucose and galactose. The resulting milk tastes slightly sweeter than regular milk but provides a way for lactose-intolerant individuals to enjoy dairy without negative symptoms. Her research was successfully commercialized by the company Lactaid. They introduced lactose-free milk to the market and later expanded the concept to offer a variety of other popular dairy products, including ice cream, cottage cheese, sour cream, and even eggnog, suitable for those with lactose intolerance.
Solar panels work by capturing energy from sunlight. They contain many small cells, usually made of silicon, which act like tiny electricity generators. When sunlight (photons) strikes these cells, it knocks electrons loose, creating a flow of direct current (DC) electricity through the photovoltaic effect. Because most homes and businesses use alternating current (AC), an inverter is used to convert the DC electricity into usable AC power. Solar power is incredibly useful because it’s a clean energy source that doesn’t produce greenhouse gas emissions while operating, taps into the renewable power of the sun, and can help lower electricity costs. Following early private inventions, US public institutions significantly advanced solar technology. The Department of Energy (DOE) has funded extensive research, notably establishing the National Renewable Energy Laboratory (NREL) in 1977, which remains dedicated to improving solar efficiency and affordability. Many public universities across the country also conduct vital, often DOE-funded, research in materials and engineering. Key contributors include institutions like Arizona State University, known for its dedicated solar programs, the University of Delaware with its early Institute of Energy Conversion, and major research hubs within the University of California system. This sustained effort, combined with falling prices, has led to explosive growth. In recent years, solar power has consistently accounted for a massive portion, often around half, of all new electricity-generating capacity added to the U.S. grid each year.
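To put rough numbers on the process described above, a panel’s DC output is approximately its area times the solar irradiance times its conversion efficiency. A back-of-the-envelope sketch with plausible but made-up figures, not data about any specific product:

```python
# Back-of-the-envelope estimate of a photovoltaic panel's DC output:
#   output = irradiance * area * efficiency
# (illustrative numbers only; real output also depends on temperature,
# panel angle, shading, and other factors)

def panel_output_watts(irradiance_w_m2: float, area_m2: float,
                       efficiency: float) -> float:
    """DC power produced by a panel under the given sunlight."""
    return irradiance_w_m2 * area_m2 * efficiency

# Full midday sun (~1000 W/m^2) on a 1.7 m^2 panel at 20% efficiency:
power = panel_output_watts(1000.0, 1.7, 0.20)   # ≈ 340 W
```

Under full sun, a typical residential panel therefore produces a few hundred watts of DC power, which the inverter then converts to AC for the home.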
CRISPR-Cas9 gene editing is a groundbreaking technology that has had widespread impact on society, from changing the way we combat diseases to improving the foods that we eat. What was originally research into the genetic code of bacteria led to an interesting discovery. Scientists noted strange repeated sequences, which were named clustered regularly interspaced short palindromic repeats, or CRISPR for short. They also discovered that these sequences effectively acted like an immune system, allowing bacteria to “remember” past viral attacks to fight infections in the future. In 2012, biochemists Jennifer Doudna at UC Berkeley and Emmanuelle Charpentier showed how to harness CRISPR, along with an enzyme known as Cas9, to cut DNA at exact locations. Ever since then, CRISPR-Cas9 gene editing technology has been adapted to cure diseases, improve crops, and aid in our quest to better understand genetics.
Am I pregnant? There have been methods to answer this question dating all the way back to the ancient Egyptians in 1350 BCE. Methods became more advanced over time, but they remained largely unreliable until there was a greater understanding of the hormonal and chemical changes that occur in the female body during pregnancy. In the 1960s and 70s, the National Institutes of Health became one of the foremost places in the entire world to conduct such research, thanks to a combination of proper funding, expertise, and a willingness to take on what, in many settings, would be considered very tedious work: the difficult extraction and synthesis of hormones.

Such a place attracted Judith Vaitukaitis and Glenn Braunstein, both medical residents in Boston, who began doing fundamental research to understand human chorionic gonadotropin (hCG), a hormone known to be linked to pregnancy and certain types of cancers. Vaitukaitis and Braunstein came to understand more about the structure of hCG and, subsequently, developed more advanced tools to detect it. These findings led to the development of the first at-home early pregnancy test, which gave women a safe and reliable way to determine pregnancy as early as 8–10 days after conception.
HIV (human immunodeficiency virus) infection, which untreated results in AIDS (acquired immunodeficiency syndrome), has killed over 40 million people worldwide since it was first identified in the early 1980s. While there is still no vaccine to prevent HIV infection, highly effective pre-exposure prophylaxis treatment, or PrEP, has become an important aspect of limiting the spread of HIV since it was approved in the US in 2012, after clinical trials funded through the National Institute of Allergy and Infectious Diseases. Born out of research into antiretroviral drugs for treatment, these daily medications are taken by people who are at higher risk of getting HIV, and have been shown to drastically reduce the chance of infection upon exposure to HIV. In the UK, for example, the number of new infections amongst gay men dropped for the first time ever in 2017, the year PrEP began being prescribed as part of the first trial in that country.

Access to these medications is a key part of the push to eliminate infections and deaths due to HIV across the United States and globally. While new infections worldwide peaked in 2004, HIV/AIDS remains one of the world’s deadliest infectious diseases, particularly across Sub-Saharan Africa. Given just how effective these medications have been, broad access to PrEP has the potential to stop this ongoing global epidemic.
Have you ever used a key card to open your office or hotel room? Have you used tap-to-pay to quickly buy a coffee or groceries? Radio Frequency Identification (RFID) is the basis for safely and securely using electronic locks and tap-to-use credit cards without worrying that somebody else’s card can also open your office or charge your account. It is in widespread use in offices and stores, and even in more modern home security systems. Developing RFID required ingenuity in figuring out ways to pattern individualized circuits that can be put in a card which, when scanned by the “lock”, gives a personalized identification code that the system can use to decide whether that key has access. Importantly, it lets these cards work without batteries, making them extremely useful in many settings.

RFID technology was made possible by research performed in a federal lab, specifically Los Alamos National Laboratory, and funded by federal agencies including the Atomic Energy Commission and the Animal and Plant Health Inspection Service of the U.S. Department of Agriculture.
Many of the technologies we have at our fingertips take advantage of huge computer systems that crunch massive amounts of numbers. Take the weather app on your phone: to predict the weather in your home state (and even specifically your city), the weather across the world needs to be recorded and fed into a computer model that can take those millions of numbers and figure out how they will affect whether or not you need to take an umbrella to work today. Similarly, the airplanes we use to travel, and the jets that comprise our air force, make use of computer models that figure out how air at high velocities will affect how bumpy your ride is. The ability to run these computer models is thanks to supercomputers: enormous, Costco-sized buildings full of stacks of computers that work together to process all of that data and transmit the results out to the world, and eventually to your phone.

Federal funding has a long history of supporting the development of the supercomputer, from the DARPA-funded “ILLIAC IV” developed at the University of Illinois Urbana-Champaign starting in 1964, through the High Performance Computing Modernization Program (HPCMP) of the 1990s co-led by NSF and DARPA, up to the modern-day continuation of these programs involving NSF, DARPA, DOE, and multiple national labs. These efforts paved the way not just for the computers housed and used federally: much of the same technology runs the servers that private companies use to house your email, connect you to social media, and much more.
In the early 2000s, the NSF funded Alvin Roth, Tayfun Sonmez, and M. Utku Unver in applying economic theory to the problem of kidney donations. At the time, extensive waiting queues prevented tens of thousands of patients from obtaining the new kidneys they needed to survive. Roth and colleagues recognized a similarity between the problem of incompatible donors (when a patient’s immune system is incompatible with a donor’s) and existing theories, described by Lloyd Shapley, about efficiently matching pairs of people to each other based on their preferences.

This research led to the development of a large-scale kidney donation program, which allows individuals all over the nation to donate to one another in complex exchanges, while retaining the incentives that drive people to donate to their relatives in need. Their framework specifically excludes monetary incentives, which proved vital given U.S. laws that forbid the exchange of body parts for money. Their program has since been adopted to enable national kidney exchanges and continues to save lives by shortening kidney donation waiting times. For their work, Alvin Roth and Lloyd Shapley shared the 2012 Nobel Prize in Economics.
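The core matching idea can be illustrated with a toy sketch (hypothetical data and a deliberately simplified rule, not the clinical algorithm): treat each incompatible patient–donor pair as a node that points at a pair whose donor is compatible with its patient, and look for cycles. Every pair on a cycle can receive a kidney, with no money changing hands.

```python
# Toy sketch of the cycle-exchange idea behind kidney paired donation.
# Hypothetical data: wants[p] is the one pair whose donor is compatible
# with pair p's patient (each pair points at exactly one other pair,
# a big simplification of the real problem).

def find_exchange_cycle(wants: dict[str, str], start: str) -> list[str]:
    """Follow compatibility pointers from `start` until a pair repeats;
    return the cycle of pairs that can all exchange kidneys at once."""
    seen, order = {}, []
    pair = start
    while pair not in seen:
        seen[pair] = len(order)
        order.append(pair)
        pair = wants[pair]
    return order[seen[pair]:]  # the cycle: each pair donates to the next

# Pair A's patient matches B's donor, B's matches C's, C's matches A's:
wants = {"A": "B", "B": "C", "C": "A"}
cycle = find_exchange_cycle(wants, "A")   # ['A', 'B', 'C']
```

Real exchanges must handle many compatible options per pair, cap cycle lengths for logistical reasons (all surgeries in a cycle happen simultaneously), and also use donation chains started by altruistic donors.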
In the 1930s and 40s, physicists at American universities, including Isidor Rabi at Columbia and teams at Harvard and Stanford, discovered and refined Nuclear Magnetic Resonance (NMR) through government-supported research. Initially used by chemists to analyze molecular structures, the technology’s potential for medical applications remained unexplored for decades. The breakthrough came in the 1970s when Paul Lauterbur, working at Stony Brook University with federal support, discovered how to generate spatial images from NMR signals. This transformative, federally supported research converted a physics lab technique into Magnetic Resonance Imaging (MRI), revolutionizing medical diagnostics without using radiation and earning Lauterbur the 2003 Nobel Prize in Physiology or Medicine.

Today, MRI technology provides safe, radiation-free imaging that reveals details of soft tissues invisible to other technologies. With approximately 40 million MRI scans performed annually in the United States, the technology helps diagnose conditions from brain tumors and stroke damage to torn ligaments and cardiac problems. This federally funded basic science research has transformed into an essential medical tool that guides surgical planning, monitors disease progression, and evaluates treatment effectiveness, directly improving outcomes for millions of patients worldwide.
For modern robots, like self-driving cars, successfully navigating our complex world is a fundamental requirement. It’s not enough for them just to follow a pre-programmed path. They need to perform two critical tasks simultaneously: figuring out their precise location (“Where am I?”) while also building and updating a detailed map of their immediate environment (“What’s around me?”). This challenging process is called Simultaneous Localization And Mapping, or SLAM.

Key developments enabling SLAM technology originated from research supported by agencies like NASA, initially envisioned for robots exploring unknown terrains autonomously. Today, SLAM forms the foundational software enabling many robots to continuously learn about their surroundings, accurately track their own position within that space, and intelligently plan paths towards their goals. Crucially, this allows them to safely maneuver around both stationary objects and unexpected obstacles that might suddenly appear, a capability essential for the early self-driving cars that won competitions like the DARPA Grand Challenge.
Exposure to certain viruses can lead to an increased risk of some cancers, and it used to be thought that all or most cancers had a viral origin. But in the early 1990s, a research group at U.C. Berkeley, led by professor Mary-Claire King, proposed the idea that mutations in particular genes could be an important risk factor for breast cancer. Using funding from the NIH, they were able to pursue this idea. The BRCA1 gene was discovered in 1994, and its counterpart BRCA2 followed a year later. This and subsequent research showed that people with mutations in these genes have an elevated risk of developing breast cancer, and that their cancers respond to treatments in different ways than other types of breast cancer.

This research directly led to novel screening tools for breast cancer risk assessment, early detection, and more targeted options for prevention and treatment – nothing short of a revolution in the standard of care for breast cancer. More generally, it played a pivotal role in the advent of “precision medicine”, in which individual patients’ genetic profiles lead to personalized risk assessment and treatment for cancer and many other diseases.
Babies born prematurely are at risk and require a lot of care in order to grow, gain weight, and thrive. Work funded by the National Institutes of Health, starting in the 1970s, led to the discovery of a surprisingly effective treatment: the simple act of touch, through 15 minutes of massage therapy three times a day, can lead to a 47% increase in the baby’s weight gain.

The story starts with Prof. Saul Schanberg’s research group at Duke University. They knew that when rat pups are separated from their mothers, they grow less well than pups that aren’t separated. They wanted to find out what made the difference. The mother’s milk? The warmth of her body? Her comforting smell? They gradually ruled all of these out and came to a surprising conclusion: it was her touch. Stroking the pups with a soft brush, in a motion that mimicked how the mother licks them, allowed the pups to thrive.

Following up on this research, Prof. Tiffany Field’s group at the University of Miami asked whether similar touch therapy with humans would have a similar beneficial effect. They found that it did, and the positive effect was big. Fast forward years later, and massage therapy for preterm babies in Neonatal Intensive Care Units (NICUs) is now commonplace, helping many thousands of babies grow to be thriving children.

In addition to the emotional toll of having a baby in the NICU, care for babies in the NICU is expensive. The use of massage therapy to help babies thrive and successfully leave the NICU was estimated in 2014 to save about $10,000 per baby, for a total in the U.S. of about $4.7 billion every year.
Closed captioning is important in everyday life because it helps people who are deaf or hard of hearing enjoy TV, videos, and online content. It also helps people learn new languages, watch videos in noisy places, or follow along in quiet environments like libraries. As more videos are made for websites, social media, and online learning, the need for captions is growing fast. This creates more jobs for captioners and makes sure more people can enjoy and understand video content.

Publicly funded research played a key role in starting closed captioning. In the 1970s, government agencies like the National Bureau of Standards (the predecessor of today’s National Institute of Standards and Technology) worked with TV stations to test early captioning systems, after being petitioned by a deaf woman to provide captions for emergency announcements. Later, the U.S. Department of Education helped fund the National Captioning Institute (NCI), which created the first captioning for live events, including news programs, sports, and the Academy Awards. NCI also captioned the first children’s program, Sesame Street. Federal funding for closed captioning helped a new technology grow into a valuable service that supports jobs and helps millions of people every day.
Publicly funded research played a big role in creating Google. In the 1990s, the National Science Foundation (NSF) gave money to Stanford University for a project called the Stanford Digital Library Project. This research helped graduate students Larry Page and Sergey Brin study how to organize and search through huge amounts of information online. Brin was also awarded an NSF Graduate Research Fellowship during his graduate studies at Stanford. This fellowship supported his research in computer science, providing both financial assistance and recognition for his academic potential. Page and Brin’s graduate work on ranking web pages based on importance led to the creation of Google’s search engine.

Other government programs also helped support early research that made Google possible. Agencies like DARPA and the intelligence community funded projects on managing and searching big sets of data. This gave Page and Brin access to the tools and ideas they needed to build something new. With support from their professors and government grants, they turned their research into a powerful tool for finding information on the internet, changing how the world uses technology.
Asthma is a chronic respiratory disease involving inflammation and narrowing of the airways, causing symptoms like coughing, wheezing, and shortness of breath. It affects a vast number of people, with estimates suggesting over 260 million individuals globally suffered from it in recent years. Historically, treatment focused mainly on relieving airway spasms (bronchospasm), often with limited success and significant risks, before the focus shifted towards managing the underlying inflammation for better long-term control.

Progress in asthma treatment heavily relies on foundational science supported by public funding. Key NIH institutes like the National Institute of Allergy and Infectious Diseases (NIAID), the National Heart, Lung, and Blood Institute (NHLBI), and the National Institute of Environmental Health Sciences (NIEHS) conduct crucial research uncovering disease mechanisms and immune pathways. Additionally, NIH and the CDC support programs like the National Asthma Education and Prevention Program (NAEPP) and the National Asthma Control Program (NACP) to translate findings into guidelines and public health action. This deep understanding allows companies like Regeneron (New York) and Sanofi (France) to develop advanced therapies. A prime example is Dupixent, which now benefits over a million patients globally by specifically inhibiting the signaling of IL-4 and IL-13, key drivers of Type 2 inflammation in asthma.
Laser technology has allowed the development of computer hard drives, barcode scanning in grocery stores, satellite broadcasting, CDs, eye surgeries, cancer treatments, 3D printing, missile defense, law enforcement, digital video, astronomy, and much more. The physics behind lasers was first proposed and developed in the 1950s by Charles Townes, working at Columbia University with funding provided by the Office of Naval Research, the United States Army Signal Corps, and the National Science Foundation. After first developing the theory and demonstrating it at microwave wavelengths (with the maser), he eventually refined the work for visible light, setting the stage for the invention of the modern laser. Lasers went on to prove immensely useful for a wide range of applications, and Townes eventually shared the 1964 Nobel Prize in Physics for his work.