An Estimate Of How Many Americans Have Died From Each Covid Variant


Since the winter of 2020, new coronavirus variants have shaped the COVID-19 pandemic, each leading to sharp increases in case counts and, eventually, deaths in the United States.

 “A significant fraction, almost half and rising, have died after the ancestral strain” of SARS-CoV-2 was replaced by variants, says Jo Walker, a graduate student at Yale and the report’s lead author. Of the more than a million Americans who had died of COVID-19 as of early May, variants killed 460,000. 

While most deaths from each variant occur during a wave’s peak, the challenge is in sorting out the moment when one variant sweeps another out of the way. When Omicron first arrived in the US this past fall, the upper Midwest was deep into a wave driven by Delta. “Those transitions are going to take place at different times and at different speeds from state to state,” says Walker.

By lining up known death tolls with Centers for Disease Control and Prevention estimates of variant prevalence in different parts of the country, the researchers could estimate what fraction of people had died from a given COVID strain. “There’s not actually a lot of complicated math going on here,” says Walker. “It’s really just that there’s a lot of data covering different locations and time periods.”
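The arithmetic Walker describes can be sketched in a few lines: each week's deaths are split among circulating variants in proportion to estimated prevalence, then summed per variant. The numbers below are invented for illustration, not the report's data.

```python
# Prevalence-weighted attribution of deaths to variants (illustrative only).
# Each week's deaths are split among circulating variants according to
# estimated prevalence shares, then accumulated per variant.

weekly_deaths = [1200, 950, 800]  # hypothetical deaths in one region, by week
prevalence = [                    # hypothetical CDC-style prevalence shares
    {"Delta": 0.90, "Omicron": 0.10},
    {"Delta": 0.40, "Omicron": 0.60},
    {"Delta": 0.05, "Omicron": 0.95},
]

deaths_by_variant = {}
for deaths, shares in zip(weekly_deaths, prevalence):
    for variant, share in shares.items():
        deaths_by_variant[variant] = deaths_by_variant.get(variant, 0.0) + deaths * share

print(deaths_by_variant)
```

Scaling this up to every state and week, as the report does, changes the volume of data but not the logic.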

Walker says that two elements of the findings jump out. The first is the toll from Omicron: Researchers estimate this currently dominant variant has killed 110,000 people so far. That’s despite the widespread misconception that Omicron is a mild variant. Two years ago, after 100,000 Americans had died in the first spring waves, Walker points out, the New York Times ran a front-page headline calling the toll “an incalculable loss.” Now, Walker says, “we see a new variant come around and it’s caused a very similar death toll in the matter of a few months,” even with vaccines widely available. The Omicron death toll has fallen most heavily on older Americans, particularly those in nursing homes.

The second is the shifting geography of the pandemic. The Northeast experienced 215 deaths per 100,000 residents before the emergence of variants. Later, variants killed a disproportionate number of people in the South—158 per 100,000 residents. That’s something epidemiologists have understood in other ways; New York City experienced the highest per-capita death tolls of the entire pandemic in April 2020, while southern states experienced prolonged outbreaks over 2021. The new analysis, though, could put specific numbers to the trend.
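The per-capita figures above are a simple rate conversion; a minimal sketch, with made-up inputs rather than the report's data:

```python
# Deaths per 100,000 residents -- the normalization used in the regional comparison.
def per_100k(deaths, population):
    return deaths / population * 100_000

# Hypothetical region: 120,500 deaths among 56 million residents.
print(round(per_100k(120_500, 56_000_000), 1))
```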

[Related: A deep dive on the evolution of COVID.]

But Susan Hassig, an infectious disease epidemiologist at Tulane University in New Orleans, says that this finding also illustrates serious limitations of the analysis. “The variant isn’t the only thing that drives mortality,” says Hassig, who wasn’t involved with this paper. “If we were in lockdown during Delta, far fewer people would have died.” And she attributes regional differences to those policy differences—New York City, for example, required masks in indoor settings during variant-driven waves, while many southern states didn’t.

“They didn’t really discuss one of the most interesting findings—explaining why the Northeast’s [death toll] was high[est] in the non-variant environment, and lowest in the variant environment,” Hassig says. “They left so many things on the table.” She acknowledges that incorporating data on policies like school closures or mask mandates is challenging, but says that she would like to see at least vaccination status included in the analysis. 

The authors write in the report that the analysis was intended to observe deaths, rather than explain their causes. But categorizing deaths by variant without incorporating other explanations, Hassig says, risks overemphasizing the role of variants in each wave. Deaths over the last six months are as much a product of undervaccination—among other policy outcomes—as they are of Omicron.

It’s a point that Walker acknowledged in an interview with Popular Science. “The fact that we do see a shift in burden [from the Northeast to South] implies that there’s something going on that isn’t just the variants,” Walker says.

A focus on variants alone may not explain why a million Americans have died. But this approach demonstrates the continued toll of failing to control the pandemic globally. The deadliest variants in the United States emerged overseas, although Walker says “there’s nothing about the US which means that variants don’t emerge here.”

Variants are both “cause and effect,” Walker says. More infectious strains can drive outbreaks—but they’re also a symptom of uncontrolled global spread, which creates the runway for SARS-CoV-2 to accumulate new mutations.

The point of focusing on variants, says Zain Rizvi, research director at Public Citizen, and a co-author on the report, “is it gets us to the connection between the global and local. It establishes that what happens in Lahore really matters for what happens in Louisiana.” That’s a message epidemiologists have yelled before, but it’s increasingly urgent as Congress’ appetite for pandemic funding dries up. 

“We see the staggering cost imposed on the US population,” Rizvi says, “and yet we still are waiting for governmental action to help reduce the risk of new variants emerging globally and to protect the lives of Americans at home.”


How Our Pandemic Toolkit Fought The Many Viruses Of 2022

COVID-19 caused headlines again this year, but it was matched by a slew of other newsworthy viruses: the adenoviruses suspected to be behind the rise in hepatitis cases in early spring, the outbreak of mpox—formerly known as monkeypox—in the summer, an early surge in respiratory syncytial virus (RSV), and a peak in influenza cases following the Thanksgiving holiday season. Each of these viruses has tested clinicians, epidemiologists, and virologists. But these experts have responded by employing some of the tools that were built during the COVID pandemic.

The beginning of 2022 brought the first trial run for our toolkit: huge numbers of COVID cases, caused by the emergence of the highly transmissible Omicron variant. Virologists had to re-enact the early days of the pandemic: identifying the strain, testing its disease severity, and understanding its ability to escape the immune system. The available COVID vaccines were pitted against Omicron and, thankfully, showed good efficacy. By now, these studies were familiar, and early results were shared quickly to inform how public health officials around the world acted to protect populations.

After the initial surge of cases, in spring of 2022, many jurisdictions began to reduce COVID testing and tracing. The Centers for Disease Control and Prevention (CDC) changed its guidance on face coverings, so fewer people wore masks out and about. Still, researchers continued to track Omicron and its subvariants, and those who’d worked at speed to understand the latest strain would get little respite—2022 had more pathogens yet to throw at them.

Genome sequencing predicts viral spread

Monitoring mutations is a virus-fighting tool that was employed early in the pandemic because it had proven helpful many times before. Since 2008, researchers sequencing all types of viruses have been able to upload whole genomes to GISAID, a science surveillance initiative. Their work allowed for quick research at the start of the H1N1 flu pandemic in 2009 and during the 2013 bird flu epidemic.

“When the unknown coronavirus emerged in January 2020, GISAID had already played a key role in influenza surveillance for 12 years,” says Sebastian Maurer-Stroh, executive director of the Bioinformatics Institute in Singapore and a collaborator with GISAID. The collaborative’s array of tools, though designed for tracking flu viruses, had been built in connection with the research community and large organizations like the World Health Organization (WHO). These tools were relatively easy to adapt to track the spread of COVID, Maurer-Stroh says.

[Related: The World Health Organization officially renamed monkeypox to mpox]

GISAID’s database of SARS-CoV-2 genomes has helped research into the pathogen’s spike protein, the area on the virus that affects how it enters our cells and causes infection. It’s also meant that countries can monitor the rise and fall of different strains in their populations and make changes to guidelines accordingly. Though submissions of new SARS-CoV-2 genomes started to trail off in early 2022, GISAID and the WHO are still tracking Omicron and the emergence of subvariants.

But in May 2022, GISAID researchers noticed a new genome being uploaded. The hMpxV virus and the disease it causes, mpox, were already endemic in countries in Africa, but rarely caused infections outside the continent. GISAID surveillance showed that there were new lineages spreading rapidly, and by July the virus was present in 75 countries. That month, the WHO declared the outbreak to be a public health emergency. Cases have been steadily dropping since then, though the WHO reports that seven countries are still seeing new cases. As of December 15, there have been more than 80,000 mpox cases worldwide.

Wastewater provides breadcrumbs for disease outbreaks

At the same time as GISAID was monitoring DNA sequences of the mpox virus, researchers were employing another surveillance tool used during the pandemic. Wastewater samples taken from July to October in New York showed that poliovirus was circulating in six of 13 sampled counties.

Wastewater sampling had detected COVID in sewers back in April 2020; in September of that year, the CDC launched the National Wastewater Surveillance System (NWSS) to monitor virus levels. Compared to mass-scale PCR testing, testing wastewater offered an easy and unobtrusive way to find out where there were hotspots of virus activity.

“You can track a lot of viruses in the wastewater, and what we’re seeing with COVID is that it may be an easier way of doing epidemiology, at least on a bigger picture scale,” says virologist Michael Teng, of the University of South Florida Department of Molecular Medicine. Wastewater surveillance can’t pinpoint individuals, so it won’t help identify potential “superspreaders” before they infect others. But it’s a great tool for virologists to see general geographical trends in virus levels.

[Related: Polio is officially circulating in the US again]

The poliovirus spread in the state was “silent,” but posed a real threat. Cases of polio had been basically non-existent in the US since the introduction of the polio vaccine, which has an average uptake of 92 percent in kids across the country—though some counties’ rates of vaccination are as low as 37 percent.

Vaccines fight viruses in and across individuals

As evidenced by the pandemic, vaccine uptake is one of the best tools, if not the best, for stopping the spread of a virus. COVID vaccines protect against infection, and if you do get the disease, you’re less likely to have severe illness if you’ve been vaccinated.

So when researchers predicted a tripledemic of COVID, the flu, and RSV heading towards the US, the message was clear: Get your flu shot and COVID booster. But with no RSV vaccine available, case numbers quickly rose in young children and elderly populations.

“We had a COVID vaccine within about 11 months of when the first virus sequence came out, but RSV was first identified in 1957, and since then we have not really had good vaccines,” says Teng, whose focus is on the respiratory pathogen. “But one of the really exciting stories for this year is that Pfizer [which developed one of the COVID vaccines] along with GSK have had really good results in tests for an RSV vaccine for the elderly.”

[Related: Fighting RSV in babies starts with a mother’s antibodies]

Teng says the purchase of COVID vaccines led to an infusion of capital into companies like Pfizer and Moderna, the latter of which has been able to invest in research it began long before the pandemic. This money meant Moderna could move forward with several vaccines in development, according to Teng, including one for HIV.

These important elements of tackling viruses in 2022—genomic monitoring, wastewater surveillance, and vaccine development—are just part of the huge fight against infectious diseases. There is, of course, still a lot we don’t know about COVID and other viruses, and we cannot predict what 2023 will bring. But researchers are armed with more information about the spread of viruses than ever before, and they’ve already begun putting the pandemic’s teachings into practice.

My Ancestor Died Of A Splinter. Wait, What?

Annie Hortense Crawford’s death was a long, dramatic affair. According to her front-page obituary in the California Democrat, the local paper serving California, Missouri, Mrs. Crawford’s decline had begun a full week prior to her 1930 death. First, there was severe pain in her hand. Slowly, a creeping debility overcame her entire body. “She grew steadily worse throughout the day and evening,” the newspaper reported, “until the end came.”

It’s easy to imagine my great-great-great-grandmother (for that’s who Crawford was) was felled by some larger-than-life illness. But the reality is a little different: Crawford died of a splinter.

Reading the details of her rapid decline almost 90 years later, I was struck by the historic nature of her death—almost unfathomable to Americans today—and set out to find out why, exactly, people don’t die of splinters anymore. In the process I discovered the peculiarities of her death weren’t, in fact, all that peculiar. And unless we change our relationship to antibiotics, death by splinter could be familiar once again.

The Octagon Ward at Johns Hopkins was part of a hospital-wide effort to stop air from circulating between care units. Wikipedia

When Crawford was born in 1860, many Westerners still attributed disease to miasma—bad air—or an imbalance in the bodily humors like blood and bile. Doctors, just as they had since the days of ancient Greece, treated all manner of illnesses with things like fresh air, rest, and even bloodletting. It’s no surprise, then, that most people in this era died of their infections and many diseases considered curable today killed thousands.

“We had a gross misunderstanding of things we called blood poisoning and things we now recognize to be infectious disease,” says Duane J. Funk, a physician and sepsis expert at the University of Manitoba. But over the course of Crawford’s life, enterprising researchers drove an incredible shift in the practice of science and, most importantly, how we think about infection.

In the late 1850s, the French scientist and father of microbiology Louis Pasteur set about disproving the common theory of spontaneous generation. At that time, many people believed that agents of decay—the things that molded bread or rotted a peach—magically appeared from within the bread or peach itself. By showing that microorganisms came from elsewhere—that they infected a body—Pasteur established the basic mechanism of infectious disease. He went on to develop the earliest technique for pasteurization, as well as rabies and anthrax vaccines.

Other scientists subsequently sought to validate and expand on Pasteur’s ideas. Though he was ridiculed at first, the inquisitive surgeon Joseph Lister ultimately proved that carbolic acid had a sterilizing effect on open wounds and, when properly applied, saved lives. In 1890, the German physician Robert Koch published his unified “germ theory.” Koch’s postulates displaced miasma theory and germ theory remains the predominant explanation for infectious disease to this day.

By the time a splinter pierced Crawford’s thumb in March of 1930, scientists knew that small microbes, invisible to the naked eye, could invade a human body and feast until the host recovered or, more often, died. These germs, such as they were, caused everything from waterborne illnesses like cholera to sexually transmitted conditions like syphilis. They were also responsible for the disease that killed Crawford: blood poisoning.

But just because doctors of that day may have understood the biological war raging in my ancestor’s thumb doesn’t mean they could cure what ailed Crawford. It would take one moldy discovery—and more than a decade of subsequent research—before anyone could do a thing about infectious disease.

A World War I-era Red Cross poster. Wikipedia

While thousands of people still die from sepsis each year, many Americans think they are impervious to such diseases. Funk says that may be because, on a statistical level, people aren’t that susceptible to death by splinter and never really have been. “I get cuts from shaving every second or third day,” says Funk. “The question is, why do some of them get infected and some of them don’t?”

The answer, he says, starts in the skin. “As soon as you get the splinter wound or the cut, right off the bat, there’s a battle that begins,” Funk says. First, blood clotting factors swarm to the affected site. This not only stops a person from bleeding out; it also serves as a biological drawbridge, raising against any potential invader. In some cases, there may not be harmful bacteria on the afflicted site at all. But if there is, the immune system is ready. It deploys white blood cells called macrophages, the body’s Roombas, to slurp up any dirt, bacteria, or other foreign objects. “Bacteria are all around us,” Funk says. “But 99 percent of the time, our immune system works great at preventing infection.”

The very young, very old, and infirm are less likely to fight back effectively, however. Genetic predispositions toward certain illnesses, how aggressive a given bacterium or virus is, and other circumstances also factor into the progression of disease. Crawford, who was 70 at the time of her death, was part of this vulnerable population. The infectious agent—like Staphylococcus aureus—was able to push past Crawford’s natural defenses, which had diminished with age, and make its way into her bloodstream.

The infection likely moved quickly from there, Funk says, thanks to the tropical heat of the human body. “Some of these bugs have a doubling time of eight to 20 minutes,” he says. “There’s two [microbes], then there’s four, eight, 16—you do the math. It doesn’t take long to have millions to billions of bacteria floating in your system.” But they didn’t just float. Division made the bacteria hungry, so they eagerly turned Crawford’s heart, lungs, liver, and other organs into food. Without medical intervention, her body was overwhelmed. Her blood pressure likely dropped suddenly. And in the absence of any suitable medical intervention, she died.
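The doubling arithmetic Funk cites compounds quickly. A small sketch of idealized exponential growth (real infections are messier, so this is illustration only):

```python
# Idealized bacterial growth: the population doubles once per doubling time.
def population(initial, doubling_minutes, elapsed_minutes):
    return initial * 2 ** (elapsed_minutes / doubling_minutes)

# From 2 cells with a 20-minute doubling time, after 10 hours:
print(f"{population(2, 20, 10 * 60):,.0f} cells")
```

Thirty doublings take a pair of microbes past two billion, which is why an untreated bloodstream infection can overwhelm the body in a matter of days.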

Alexander Fleming, the father of penicillin, in the lab. Wikimedia Commons

For thousands of years, the fate of sepsis patients was largely sealed. But that began to shift in 1928 when the Scottish scientist Alexander Fleming discovered penicillin, the world’s first antibiotic. Just two years before Crawford’s death, Fleming was working with a lab culture of Staphylococcus, which just so happens to be one of the two main agents that cause sepsis. He noticed what scientists call an “inhibition halo”—a line the bacteria could not cross—clearly defined on the lab specimen. A blue-green mold had contaminated the sample and inhibited bacterial growth. Fleming’s original experiment was wrecked, but the mishap presented him with an unprecedented opportunity.

Upon isolating the mold, Fleming found he had the relatively common fungus Penicillium notatum on his hands. The mold, which thrives in damp environments, easily infests water-damaged buildings. When airborne, it can cause allergic reactions in humans. But when synthesized into a bacteria-fighting drug, Fleming realized, the cloudy growth could save thousands of lives. The only problem: it couldn’t be synthesized.

For a decade, Fleming tried and failed to persuade chemists and manufacturers to help him transform his fungal find into a mass-market product, knowing all the while that lives were being unnecessarily lost to infection. It was not until World War II that penicillin made its debut as a bona fide treatment. In 1941, scientists at the USDA isolated higher-yield strains of the mold, which they used to successfully treat burn victims of the 1942 Cocoanut Grove nightclub fire in Boston. At the same time, researchers at the drug company Pfizer perfected a deep-tank fermentation system that generated high-quality penicillin in industrial quantities, which allowed the drug to finally go mainstream.

But even with modern medicine, Funk says Crawford may not have recovered. The U.S. Centers for Disease Control estimates that 1.5 million Americans get sepsis each year. Whether it’s from a splinter like Crawford’s or, more commonly, a hospital-acquired infection, sepsis continues to kill approximately 250,000 Americans annually. And not only can antibiotics fail—it’s increasingly apparent they can create problems all on their own.

A close-up of MRSA. Pixino

While penicillin was still being hailed as a wonder drug, by 1942 scientists were suddenly aware of a terrifying possibility: antibiotic-resistant superbugs. Mere months after penicillin had finally been mass-produced and deployed, researchers reported the existence of penicillin-resistant bacteria. “By growing the organism in increasing concentrations of penicillin over a long period it was possible to render the organism resistant to penicillin,” Charles H. Rammelkamp and his colleagues wrote at the time.

The biggest fears of these early scientists have since been realized. Today, at least 2 million Americans experience antibiotic-resistant infections annually. Approximately 23,000 die as a result. Antibiotics have saved thousands of lives, but they have also slowly selected for even more powerful bacteria. A round of penicillin might kill 99.9 percent of the harmful bugs in a person’s body, but the few organisms that live are stronger than average and now they’re free to breed wildly. Given the right environment—like a weakened immune system in an ICU patient—the already-scary Staphylococcus aureus can transform into methicillin-resistant Staphylococcus aureus, more commonly called MRSA.
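The 99.9-percent figure above is exactly the selection dynamic that breeds resistance. Here is a toy model, with all rates invented for illustration, of repeated antibiotic rounds in a population with a tiny resistant minority:

```python
# Toy selection model: each antibiotic round kills 99.9% of susceptible
# bacteria and none of the resistant ones; survivors then regrow to a
# fixed carrying capacity in proportion to their numbers.

def treat_and_regrow(susceptible, resistant, kill=0.999, capacity=1_000_000):
    susceptible *= 1 - kill                        # the drug wipes out most susceptibles
    scale = capacity / (susceptible + resistant)   # regrowth preserves proportions
    return susceptible * scale, resistant * scale

s, r = 999_990.0, 10.0                             # resistance starts at 0.001%
for n in range(3):
    s, r = treat_and_regrow(s, r)
    print(f"after round {n + 1}: resistant fraction = {r / (s + r):.1%}")
```

Three rounds are enough to flip the population from 0.001 percent resistant to almost entirely resistant, which mirrors how MRSA-like strains come to dominate under heavy antibiotic use.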

About 720,000 Americans acquired infections while in the hospital in 2011, according to a CDC report. And for every three people who die in the hospital, one dies of sepsis. While doctors are working to reduce the risk of superbugs by instituting new protocols, such as minimizing the use of ventilators (which can cause pneumonia) and carefully tailoring treatments to the specific bacteria, regaining control over these pathogens has proven difficult.

Despite increasing awareness, doctors continue to overprescribe antibiotics and patients continue to quit antibiotic regimens before they’re supposed to. At the same time, the livestock industry consumes 70 percent of the antibiotics in the United States to keep its animals healthy—all the while breeding antibiotic-resistant meat, soils, and even farmers. A 2014 report from the United Kingdom predicted 10 million annual deaths due to antibiotic resistance by 2050. While experts still quibble over the impending tsunami of deaths from antibiotic resistance, one thing is clear: People rarely die of splinters today, but 2050 is looking a little different.

Poring over the reports on antibiotic resistance, I’m reminded of a dystopian novel I once read called Station Eleven by Emily St. John Mandel. At one point in the book, the main character describes her brother’s death as “The kind of stupid death that never would’ve happened in the old world. He stepped on a nail and died of infection.” While Station Eleven was a work of fiction, I remember feeling a jolt down my spine when I read that line for the first time. My great-great-great-grandma’s all-too-real obituary (“The injury was so insignificant that she thought nothing of it… until the end came”) gave me the same sensation. Reading the detailed account of Crawford’s death, I can’t help but think this brave new world looks a lot like the old one she left behind.

Types Of Data Scientists: An Array To Choose From

Data scientists come in numerous flavors, with various qualities that may suit various kinds of companies depending on the sorts of issues or projects they work on.

Data scientists have always been around; it is only that nobody realized the work these individuals were doing was called data science. Data science as a distinct field has emerged only over the last few years, yet people have long worked in it as analysts, mathematicians, machine learning scientists, actuarial scientists, business analytics practitioners, digital analytics consultants, quality analysts, and spatial data scientists. People in these roles are well equipped with data science skills and are in high demand in industry. Data science has quickly developed into a challenging, lucrative, and highly rewarding career. While developed nations grew comfortable with it partway through the last decade, data science has received attention on a worldwide scale following the exponential growth of e-commerce in developing economies, particularly India and China. Over the previous decade, there has been a significant shift in the way the world shops, books holidays, makes transactions, and does basically everything else. Not all data scientists are made equal, particularly now that a few “generations” of data scientists have entered and left organizations. Today, data scientists come in many flavors, with different qualities that may suit different kinds of companies depending on the sorts of problems or projects they are working on. That is not to say one type of data scientist is better or worse than another; everything depends on what a business is looking for.

Management Consultant

This classification spans the junior business analyst and the ex-McKinsey consultant. They share a common enthusiasm for Excel and a capacity to flaunt v-lookups and fancy formulas, even when planning their house move. They are also the ones with the most passion for the business problem: for them, business comes first, data after. They learned Python or R out of necessity, not because they enjoyed programming. They try to avoid coding as much as possible, and their code is generally about as reusable as a single-use napkin. They have good instincts for the basics of statistics, though they learned concepts like the p-value or the t-test the hard way. They are good at data science projects that support decision-making, business-oriented processes, and one-off analyses.


Statistician

This is data analysis in the conventional sense. The field of statistics has always been about number crunching. A solid statistical base qualifies you to branch into many data science areas. Hypothesis testing, confidence intervals, analysis of variance (ANOVA), data visualization, and quantitative research are some of the important skills possessed by statisticians, and they can be extended into expertise in specific data science fields. Statistical knowledge combined with domain knowledge (for example, marketing, risk, or actuarial science) is the ideal blend for a statistician’s work profile. Statisticians can create statistical models from big-data analyses, carry out experimental design, and apply theories of sampling, clustering, and predictive modeling to data to inform future corporate decisions.
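As a small illustration of the hypothesis-testing skill mentioned above, here is a hand-rolled two-sample t statistic (equal-variance form) using only the standard library; the data are invented:

```python
import statistics as st

# Two-sample (pooled-variance) t statistic: how many standard errors apart
# are the two group means?
def t_statistic(a, b):
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * st.variance(a) + (nb - 1) * st.variance(b)) / (na + nb - 2)
    return (st.mean(a) - st.mean(b)) / (pooled_var * (1 / na + 1 / nb)) ** 0.5

control = [10.1, 9.8, 10.3, 10.0, 9.9]   # made-up measurements
treated = [10.9, 11.2, 10.8, 11.0, 11.1]
print(round(t_statistic(control, treated), 2))
```

In practice a statistician would pair this statistic with degrees of freedom and a p-value (for example, via scipy.stats.ttest_ind) before drawing any conclusion.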

Data Science for People

The consumers of the output are decision-makers such as executives, product managers, designers, or clinicians. They need to draw inferences from data in order to make decisions: which content to license, which sales lead to follow, which drug is less likely to cause an allergic reaction, which web page design will lead to greater engagement or more purchases, which email will yield higher revenue, or which specific aspect of a product’s user experience is suboptimal and needs attention. These data scientists design, define, and implement metrics, run and interpret experiments, create dashboards, draw causal inferences, and generate recommendations from modeling and measurement.

Academia Data Scientist

They often have a PhD and come from a research background. They studied hardcore math and statistics and could talk for hours about the philosophical differences between Bayesian and frequentist methods. They are typically fine at coding, as long as they don’t have to push themselves too far into data engineering territory; a test-driven programming approach may be a stretch for them. However, they are probably comfortable with lower-level languages such as C++, which can come in handy for large-scale applications or deep learning. What they generally lack is business thinking. Shipping a product is likely the ultimate objective for them, since they see it as the equivalent of publishing a paper in academia. They are good at complex ML projects at the front edge of development: they can push boundaries and work through stacks of research papers to pick and implement the best ideas. A deep-tech organization would probably need a small handful of these profiles.

Actuarial Science

Actuarial science has been around for quite a while. Banks and financial institutions rely heavily on actuarial science to anticipate economic conditions and project future income, revenue, and profits or losses from mathematical models. It is possible to be an actuarial scientist without any data science training; however, a data scientist will have an excellent grasp of the mathematical and statistical methods that actuarial science requires. Many organizations are now speeding up the cycle by employing CFAs to do the work of an actuarial researcher. This is a specialized position that requires data science professionals to apply mathematical and statistical models to BFSI (banking, financial services, and insurance) and related professions. One must have a globally recognized skill set, demonstrated by passing a series of professional examinations, before applying for this position. A prerequisite is knowledge of various interrelated subjects, such as probability, statistics, finance, economics, financial engineering, and computer programming.


Myths Vs. Facts: Making Sense Of COVID-19 Vaccine Misinformation

When so much wrong information is readily available, convincing people to get vaccinated has proven to be a huge challenge.

Myth: pronounced mith; noun; definition: a widely held but false belief or idea; synonyms: misconception, fallacy, fantasy, fiction.

Among the many reasons COVID-19 vaccination rates in the United States peaked earlier than experts hoped—then, rather than crescendoing into the summer months, began trending downward—are myths that took hold among the unvaccinated and solidified as their reasons not to get the shots. The vaccine will make women sterile; the vaccines are too new; the shots have a microchip in them; the vaccine itself will give me COVID; I’m immune because I had COVID; breakthrough cases prove vaccines are useless.

There are more. And none of them are true. 

So let’s cut to the chase. Myth vs. Fact. The Brink took some of the most widespread myths to two leading infectious disease experts, Davidson Hamer, a faculty member of BU’s School of Public Health, School of Medicine, and National Emerging Infectious Diseases Laboratories, and Sabrina Assoumou, a BU School of Medicine assistant professor of medicine and of infectious diseases and a Boston Medical Center physician.

If these two experts encountered someone on the street who cited one of these myths as their reason not to get vaccinated, this is what they would say to them. To provide extra context, we include one more fact.

MYTH: The COVID vaccines were not rigorously tested, which is why they have only emergency authorization approval and not full Food and Drug Administration approval. (Update: Pfizer’s vaccine received full FDA approval on August 19)


“Vaccine developers didn’t skip any testing steps, but conducted some of the steps on an overlapping schedule to gather data faster.”—Johns Hopkins Medicine

Assoumou: This is the most common question I get asked. I think there is a perception that things moved very fast, but we want to underscore that the technology being used now was being studied for a decade. The main difference between emergency use versus full FDA approval is that you need two months of monitoring rather than six months. When you look at the history of vaccines, if patients were to develop side effects, these occurred within two months. We are now over six months into our experience with these vaccines. We have not seen anything that would make us believe that the risks outweigh the benefits. And vaccines have saved so many lives.

Hamer: The development was more rapid than for many other vaccines. But it used the same process of phase one and phase two trials, following appropriate safety measures. Phase three trials were large-scale trials done rigorously with very clear outcome definitions. The safety measures and approaches taken are standard for clinical trials; they were just done more rapidly than usual. The full review process is ongoing, and we are already hearing that Pfizer will have full FDA authorization by September and Moderna soon after.

MYTH: The technology used to create the COVID vaccines is too new to be safe.


The technology used, called messenger RNA, or mRNA, is not new. Research on it actually began in the early 1990s, and two diseases that are very close to COVID—SARS (severe acute respiratory syndrome) in 2003, and MERS (Middle East respiratory syndrome) in 2012—helped bring mRNA vaccine development to present-day use.—Centers for Disease Control and Prevention, Understanding mRNA COVID-19 Vaccines

Assoumou: The reason this is called SARS-CoV-2 is that there was a SARS-1, the original one, and scientists were already working on a vaccine for it. So when this pandemic arrived, they had already developed a lot of the science. A decade of work had actually gone on. That’s one point I like to emphasize when people think it was rushed.

The other point I like to remind people is that these vaccines went through all the regulatory steps like any other vaccines. None of this was rushed. The FDA reviewed all the data. When you say “Emergency use,” people think it was rushed, but the way to think about it is that the benefits outweigh the risks.

MYTH: Breakthrough cases prove that even if I get the vaccine, I might still get COVID. So why bother?


As of August 9, the CDC said there had been 8,054 fully vaccinated people who tested positive for coronavirus and were hospitalized or died—out of more than 166 million fully vaccinated Americans. That’s roughly 0.005 percent. Additionally, CDC director Rochelle Walensky has said that 99.5 percent of all deaths from COVID-19 are in the unvaccinated.—Politifact, Fact Checking Joe Biden’s Figure on Unvaccinated COVID-19 Deaths

Hamer: COVID vaccines have been shown to be very powerful in preventing more severe disease and the need for hospitalization. Breakthroughs occur at a much, much lower rate than in people who are unvaccinated. The breakthroughs have been occurring more frequently with the Delta variant because of the high level of infectiousness (or transmissibility) of the Delta variant and lower protection of current vaccines against this variant. But people having breakthroughs have much more mild infection, more like an upper respiratory infection. The vaccines prevent severe disease and complications and allow people to return to a more normal state. 

Assoumou: I was just at the hospital taking care of patients. I can tell you all the cases of people getting hospitalized are unvaccinated. Breakthrough cases account for much less than 1 percent. There are so many zeros before the one—99 percent of people dying now of COVID are unvaccinated. And 97 percent of those hospitalized are unvaccinated. We are just not seeing large numbers of people vaccinated being hospitalized. And if you get it, for the most part it is like having a cold.

MYTH: The COVID vaccines can affect a woman’s fertility.


This rumor started after a report, circulated on social media, inaccurately claimed that the spike protein on this coronavirus was the same as another protein, syncytin-1, that is involved in the growth and attachment of the placenta during pregnancy. It was quickly debunked as false by the scientific community.—STAT News, Shattering the Infertility Myth

Hamer: I think people were worried that the messenger RNA in these vaccines messes with their genes. It doesn’t. It doesn’t even make it into the nucleus of your cells. It won’t interfere with any metabolic activity.

Assoumou: Ohh, this is so common that I hear this. For young women of childbearing age, it’s a common question. It started with a report that was incorrect and has been debunked. But unfortunately, once the information gets out there, the correct information doesn’t always come through. I tell people that when we look at the mechanism by which these vaccines work, we see that they simply don’t impact fertility.

MYTH: I already had COVID, therefore I don’t need the vaccine. I’m immune.


“After people recover from infection with a virus, the immune system retains a memory of it,” the National Institutes of Health explains. But that memory fades over time, which means that even after you recover from COVID, you can become infected again. Studies have been unclear how long immunity lasts after having COVID—most experts believe anywhere from 90 days to six months, though it could be longer.—National Institutes of Health

Assoumou: That’s a very common one. The information we have right now is that vaccines provide a more broad-based immune response that will protect you for a longer period of time. With the mRNA vaccines, you have two shots, one to prime and then another one to boost the immune system. You need the boost to protect you for a longer period of time.

Hamer: After three to six months or so, the natural immunity begins to wane and the risk of reinfection returns. We are definitely seeing people develop reinfections. Receiving the vaccine after having COVID is like a booster effect, and therefore it’s much more effective. Studies have been done comparing those who had the disease versus those who did not, and those who got at least one shot after having COVID end up with very high levels of antibodies.

MYTH: Children do not need to be vaccinated because they do not become sick from COVID-19.


“Hundreds of children in Indonesia have died from the coronavirus in recent weeks, many of them under age 5.” A five-year-old boy in the state of Georgia died of coronavirus in July.—The New York Times and CNN

Assoumou: That’s a very common one. It is true that children are not dying at the same rate as we are seeing in older adults. But children are going to grow up to be adults. We want to protect them as soon as possible. In addition, we are seeing some of the consequences of COVID. Not only deaths. But there is also multisystem inflammatory syndrome in children (MIS-C), where children get very sick, and we are still figuring out the details and long-term complications of this syndrome. Then there are emerging data that children are developing long COVID [symptoms that linger]. We have vaccines that are safe and effective in children 12 and above, and we’re hoping we’ll have them soon for younger kids.

Also, children are part of the “herd.” When we talk about “herd immunity,” we are referring to the level of immunity when the disease stops spreading in the community. Children are part of the population. Now that we have the Delta variant, we’re going to have to get to an even higher percentage of the population vaccinated to reach population level immunity. Children are part of the community. It will be harder to get to some normalcy if a large proportion of the population remains unvaccinated.

MYTH: I’m vaccinated. So I can drop all my COVID precautions, right?


Studies have shown that a person infected with the Delta variant of COVID has roughly 1,000 times more copies of the virus in their respiratory tracts than a person infected with the original strain.—CDC, Delta Variant: What We Know About the Science

Assoumou: The vaccines are safe, and remarkably effective. But what precautions to take will depend on a lot of factors. For example, where you live: are you in a place with high vaccination coverage, like Massachusetts, or a southern state with low vaccination coverage and a high case rate? It also depends on what activity you are engaging in. Outside, not in a crowd, that’s safe; you don’t need a mask. But inside, in a crowd where you don’t know who is vaccinated or unvaccinated, you may still want to follow public health measures. If you have children under 12, like I do, then you also need to be a little more cautious. In addition, if you have a compromised immune system, you need to take some precautions. If you happen to be in a place with high vaccination coverage and a lower case rate, then it might depend on your level of comfort with risk.

I do want to remind people there are still places we should mask up, like the doctor’s office or on public transportation.

MYTH: Getting the COVID vaccine actually gives you COVID.


It is not medically possible. The vaccine does not contain the virus.—Johns Hopkins Medicine

Hamer: COVID vaccines are not made with live SARS-CoV-2 virus. They do not give individuals the virus itself, so you can’t get COVID from getting the vaccine.

MYTH: A microchip, with the backing of Bill Gates, is being implanted with the vaccine.


This one started when Microsoft cofounder Gates said in an interview: “We will have some digital certificates” that could ultimately show who’s been tested and who’s been vaccinated. (Alas, he never mentioned microchips.)—BBC, Coronavirus: Bill Gates Microchip Conspiracy Theory

Assoumou: If you are worried about being monitored, just look at your phone. You’re much more likely to have your activities tracked there. There is no microchip in the vaccine. 

(When asked if she is able to keep a straight face when someone brings up the microchip to her, Assoumou said: “You have to be empathetic, so that people know you are listening to them.”)


How To Estimate Software Bugs? And Should You?

To say that it is somewhat of a hot topic among teams around the globe wouldn’t be an overstatement. But should you estimate software bugs?

First, let’s look at the options for estimation that we have in the first place.

Dedicated Time for Bug Fixing

Instead of estimating bugs beforehand in their bug management tools, teams often dedicate a specific portion of each sprint, day, week, or month to bug fixing.

They only estimate if, after the initial investigation, a bug turns out to be a bigger fix or requires a change to the product’s behavior. In that case, they are likely to treat the bug fix like a feature that undergoes the complete process of specification, design, development, testing, and release.

Default Estimation

Default estimation is another approach: assign every bug a fixed value of 0.5 to 1 day, since most bugs don’t take more than a day to fix. Some teams take this method to an extreme and treat all tickets this way.

It works because not only do things average out over time, but as people get more comfortable with each other, the tickets they create become roughly the same size.

Estimation with Historical Data

With enough data, you can build a much more contextual system: use historical data from your bug management tools to predict the time a given bug will take to fix, using natural language processing, machine learning, and other approaches.
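As a minimal sketch of the idea — no real NLP or ML here, just keyword overlap against past tickets, and all ticket titles and numbers are hypothetical:

```python
def tokenize(title):
    """Lowercase and split on whitespace; a stand-in for real NLP features."""
    return set(title.lower().split())

def estimate_hours(new_title, history):
    """Predict fix time as the average of past bugs whose titles share
    at least one keyword with the new bug's title; fall back to the
    overall historical average when nothing matches."""
    tokens = tokenize(new_title)
    matches = [hours for title, hours in history if tokens & tokenize(title)]
    pool = matches or [hours for _, hours in history]
    return sum(pool) / len(pool)

# Hypothetical history: (ticket title, hours the fix actually took)
history = [
    ("login page crash on submit", 3.0),
    ("crash when uploading large file", 5.0),
    ("typo in settings label", 0.5),
]
print(estimate_hours("crash after login timeout", history))  # → 4.0
```

A production system would replace `tokenize` with proper text features (TF-IDF, embeddings) and the simple averaging with a trained regression model, but the shape of the pipeline — featurize the new ticket, compare against history, aggregate past fix times — stays the same.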

No Estimation

And then there is another school of thought which believes that since you can’t estimate the time it will take to fix a certain bug until you’ve located the problem, trying to come up with an estimate is pointless.

So Should You Estimate Bugs?



Logical reasons back both sides of the argument: to estimate or not to estimate.

For: Estimation works when at least one engineer knows the exact source of the bug and how to fix it.

Against: Some bugs are so obscure that it’s hard to predict how long they will take to fix. In such situations, it’s better to use a default estimate or not to estimate at all.

There’s no harm in occasionally underestimating or overestimating, but if your estimates are consistently off, something needs to be adjusted.
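One way to tell whether your estimates are consistently off is to track the ratio of actual to estimated time across recent tickets. A minimal sketch, with hypothetical numbers:

```python
def estimate_accuracy(tickets):
    """Mean ratio of actual to estimated hours across tickets.
    Around 1.0 means estimates are well calibrated; consistently above
    1.0 means underestimation, below 1.0 means overestimation."""
    ratios = [actual / estimated for estimated, actual in tickets]
    return sum(ratios) / len(ratios)

# Hypothetical (estimated, actual) hours for recent bug fixes
recent = [(2.0, 3.0), (1.0, 2.0), (4.0, 6.0)]
print(round(estimate_accuracy(recent), 2))  # → 1.67
```

In this made-up sample, fixes take about 1.7x longer than estimated, which would suggest scaling future estimates up rather than abandoning estimation altogether.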

Ray Parker

Ray Parker is a senior marketing consultant with a knack for writing about the latest news in tech, quality assurance, software development and testing. With a decade of experience working in the tech industry, Ray now dabbles out of his New York office.
