




No care packages


"The experiment would consist of placing a crew on the space station for say seven or eight months, then taking them from the station and landing them on the Moon and asking them to survive there for nine months to a year, with no further assistance other than what they have brought," says Griffin.

"After that, return them to the space station for another six or seven months and then back to Earth. All with no extra assistance – because that is what it will be like when we go to Mars," Griffin continued. "Unless we can do that experiment successfully, the first crew to go to Mars will not come back."

Griffin is not alone in this uncompromising view. A raft of space agencies, such as the China National Space Administration and the European Space Agency, want to cooperate on crewed Mars missions in the future, potentially using the ISS and the Moon as staging posts.

"I fully agree with what Mike says," says Jean-Jacques Dordain, director general of ESA. "We need to know much more about the Moon and Mars and how humans can use the resources in situ, not launch every kilo of stuff they will ever need. That's why in the meantime a lot of robotic missions to both the Moon and Mars are so very important."



Study reveals specific gene in adolescent men with delinquent peers

But family environment can tip the balance for better or worse

TALLAHASSEE, Fla. -- Birds of a feather flock together, according to the old adage, and adolescent males who possess a certain type of variation in a specific gene are more likely to flock to delinquent peers, according to a landmark study led by Florida State University criminologist Kevin M. Beaver.

"This research is groundbreaking because it shows that the propensity in some adolescents to affiliate with delinquent peers is tied up in the genome," said Beaver, an assistant professor in the FSU College of Criminology and Criminal Justice.

Criminological research has long linked antisocial, drug-using and criminal behavior to delinquent peers -- in fact, belonging to such a peer group is one of the strongest correlates of both youthful and adult crime. But the study led by Beaver is the first to establish a statistically significant association between an affinity for antisocial peer groups and a particular variation (called the 10-repeat allele) of the dopamine transporter gene (DAT1).

However, the study's analysis of family, peer and DNA data from 1,816 boys in middle and high school found that the association between DAT1 and delinquent peer affiliation applied primarily to those who had both the 10-repeat allele and a high-risk family environment (one marked by a disengaged mother and an absence of maternal affection).

In contrast, adolescent males with the very same gene variation who lived in low-risk families (those with high levels of maternal engagement and warmth) showed no statistically relevant affinity for antisocial friends.

"Our research has confirmed the importance of not only the genome but also the environment," Beaver said. "With a sample comprised of 1,816 individuals, more than usual for a genetic study, we were able to document a clear link between DAT1 and delinquent peers for adolescents raised in high-risk families while finding little or no such link in those from low-risk families. As a result, we now have genuine empirical evidence that the social and family environment in an adolescent's life can either exacerbate or blunt genetic effects."

Beaver and research colleagues John Paul Wright, an associate professor and senior research fellow at the University of Cincinnati, and Matt DeLisi, an associate professor of sociology at Iowa State University, have described their novel findings in the paper "Delinquent Peer Group Formation: Evidence of a Gene X Environment Correlation," which appears in the September 2008 issue of the Journal of Genetic Psychology.

The biosocial data analyzed by Beaver and his two co-authors derived from "Add Health," an ongoing project focused on adolescent health that is administered by the University of North Carolina-Chapel Hill and funded largely by the National Institute of Child Health and Human Development. Since the program began in 1994, a total of nearly 2,800 nationally representative male and female adolescents have been genotyped and interviewed.

"We can only hypothesize why we saw the effect of DAT1 only in male adolescents from high-risk families," said Beaver, who will continue his research into the close relationship between genotype and environmental factors -- a phenomenon known in the field of behavioral genetics as the "gene X environment correlation."

"Perhaps the 10-repeat allele is triggered by constant stress or the general lack of support, whereas in low-risk households, the variation might remain inactive," he said. "Or it's possible that the 10-repeat allele increases an adolescent boy's attraction to delinquent peers regardless of family type, but parents from low-risk families are simply better able to monitor and control such genetic tendencies."

Among female adolescents who carry the 10-repeat allele, Beaver and his colleagues found no statistically significant affinity for antisocial peers, regardless of whether the girls lived in a high-risk or low-risk family environment.



Too many calories send the brain off kilter

An overload of calories throws critical portions of the brain out of whack, reveals a study in the October 3rd issue of the journal Cell, a Cell Press publication. That response in the brain's hypothalamus—the "headquarters" for maintaining energy balance—can happen even in the absence of any weight gain, according to the new studies in mice.

The brain response involves a molecular player, called IKKβ/NF-κB, which is known to drive metabolic inflammation in other body tissues. The discovery suggests that treatments designed to block this pathway in the brain might fight the ever-increasing spread of obesity and related diseases, including diabetes and heart disease.

"This pathway is usually present but inactive in the brain," said Dongsheng Cai of the University of Wisconsin-Madison. Cai said he isn't sure exactly why IKKß/NF-?B is there and ready to spring into action in the brain. He speculates it may have been an important element for innate immunity, the body's first line of defense against pathogenic invaders, at some time in the distant past.

" In today's society, this pathway is mobilized by a different environmental challenge—overnutrition," he said. Once activated, "the pathway leads to a number of dysfunctions, including resistance to insulin and leptin," both important metabolic hormones.

Earlier studies showed that overnutrition can spark inflammatory responses in the peripheral metabolic tissues, including the muscles and liver, and therefore cause various metabolic defects in those tissues that underlie type 2 diabetes. As a result, scientists identified IKKβ as a target for an anti-inflammatory therapy that was effective against obesity-associated diabetes.

Yet whether metabolic inflammation and its mediators played a role in the central nervous system remained uncertain. Now, the researchers show that a chronic high-fat diet doubles the activity of this inflammatory pathway in the brains of mice. Its activity is also much higher in the brains of mice that are genetically predisposed to obesity, they found.

The researchers report that the increased activity of the IKKβ/NF-κB pathway can be divorced from obesity itself -- infusions of either glucose or fat alone into the brains of mice led to this inflammatory brain reaction.

Further studies revealed that this activity in the brain leads to insulin and leptin resistance. Insulin lowers blood sugar by causing cells of the body to take it up from the bloodstream. Leptin is a fat hormone important for appetite control.

Moreover, the researchers found that treatments preventing the activity of IKKβ/NF-κB in the animals' brains protected them from obesity.

While chronic inflammation is generally considered a consequence of obesity, the new results suggest the inflammatory reaction might also be a cause of the imbalance that leads to obesity and associated diseases, including diabetes. As Cai says, it appears that inflammation and obesity are "quite intertwined." An abundance of calories itself promotes inflammation, while obesity also feeds back to the neurons to further promote inflammation in a kind of vicious cycle.

The findings could lead to treatments that might stop this cycle before it gets started.

"Our work marks an initial attempt to study whether inhibiting an innate immune pathway in the hypothalamus could help to calibrate the set point of nutritional balance and therefore aid in counteracting energy imbalance and diseases induced by overnutrition," the researchers said. "We recognize that the significance of this strategy has yet to be realized in clinical practice; currently, most anti-inflammatory therapies have limited direct effects on IKKß/NF-?B and limited capacity to be concentrated in the central nervous system. Nonetheless, our discoveries offer potential for treating these serious diseases."

If realized, such a strategy would likely offer a safe approach given that the critical pathway appears to be unnecessary in the hypothalamus under normal circumstances, they noted.

The researchers include Xiaoqing Zhang, Guo Zhang, Hai Zhang, Hua Bai and Dongsheng Cai of the University of Wisconsin-Madison, Madison, WI, and Michael Karin of the University of California, San Diego, La Jolla, CA.

Cross kingdom conflicts on a beetle's back

BOSTON, Mass. (Oct. 2, 2008)—Researchers from Harvard Medical School and the University of Wisconsin-Madison have discovered how beetles and bacteria form a symbiotic and mutualistic relationship—one that ultimately results in the destruction of pine forests. In addition, they've identified the specific molecule that drives this whole phenomenon.

The context of this discovery can easily be imagined as a story arc that includes some of the most unlikely characters and props.

Setting: The interior of a pine tree.

Enter the protagonist: The pine beetle, boring its way through the bark, a five-millimeter arthropod ready to lay a few hundred eggs. Tucked in a specialized storage compartment in its shell, the beetle has a ready supply of spores for Entomocorticium, a nourishing fungal baby food for the beetle's gestating larvae.

Enter the antagonist: The mite, a microscopic interloper that secretly hitched a ride on the beetle.

Conflict: Unbeknownst to mother pine beetle, the mite has snuck in a supply of Ophiostoma minus, a pathogenic fungus that can wipe out the entire supply of the larvae's fungal food. The mite releases this fungus.

Climax: Will the baby beetles die of starvation?

Resolution: Catching the mite off guard—as well as the scientists conducting the study!—the mother beetle is ready with an actinomycete, a bacterium that neutralizes the pathogenic fungus by means of a tiny fatty acid.

Conclusion: While the actinomycete rescues the baby beetles from certain starvation, the larvae-friendly Entomocorticium softens up the pine, allowing the fledgling beetles to eat not only the fungi but the tree itself. Soon, the young beetles leave to begin their new lives. Mother beetle gathers up the remaining supply of Entomocorticium and heads for another tree. The beetles live, and the infernal mite is thwarted.

Surprise ending: The camera pans back, and we quickly realize that the beetles' success has cost the tree its life. An aerial view reveals miles and miles of dead pine forest, and, as the ominous audio track implies, scores of pine beetles will continue moving from tree to tree leaving ravaged forests in their wake.

"So you have a beetle, a mite, a tree, two kinds of fungi, and a bacterium," says Jon Clardy, Harvard Medical School professor of biological chemistry and molecular pharmacology who, along with Cameron Currie from the University of Madison-Wisconsin, is co-senior author on the study. "Discovering this particular bacterium, and the active molecule, has added the molecular dimension to this chemical ecology of this complex multi-lateral system. It highlights the importance of bacteria in ways that people don't really even think about."

The findings will be published in the October 3 issue of Science.

The groundwork for this study began in 1999 when Currie published a paper demonstrating how leafcutter ants mediate their fungal environment through bacteria. Suspecting that this phenomenon may be common throughout the animal kingdom, Currie teamed up with Clardy to examine the pine beetle.

Pine beetles are like little landscape engineers, drilling through the bark and into pine trees, using fungus to create an environment in which to lay their eggs. As a result of this activity, thousands of miles of trees are destroyed each year, often resulting in widespread forest fires. Regions such as western Canada are particularly affected by this.

Experts have known that just like the fungus-growing ants, pine beetles also use fungus to feed their larvae, and that they often manage to avoid the adverse effects of pathogenic fungi often present in the tree. But the precise means by which they interact with fungal microbes had never been demonstrated.

Currie and research assistant Jarrod Scott discovered that the beetle carries a bacterium in a specialized compartment, and after a series of experiments found that the bacterium produces an antifungal agent that kills the pathogenic fungus brought in by the trespassing mite.

In order to delve deeper into how the bacterium works, Dong-Chan Oh, a postdoctoral researcher in Clardy's Harvard Medical School lab, used a variety of laboratory tools, such as nuclear magnetic resonance techniques and chromatography, to both locate the molecule and identify its structure. The molecule turns out to be a kind of fatty acid.

"It's becoming clear that symbiotic relationships between plants, animals, and microbes are essential for the diversification of life and evolution of organisms," says Currie. "This is an example of a system where we have insights into the importance of the diversity of microbes. We believe that this type of mutualism is widespread."

In addition, the researchers suspect that this association represents a source of small molecules that can be used in medicine.

"This molecule is nature's anti-fungal," says Clardy, "and it looks like there are a lot of them."

This is particularly significant, since pathogenic fungal infections in people are a major health concern. These infections are often fatal, and at the moment, no reliable medications for them exist. Here, however, we have an example of an antibiotic successfully disabling a powerful fungus.

"This particular molecule is too unstable to be a viable candidate," says Clardy. "Still, we need to study how it kills fungi, learn the mechanisms. We can look into other bacterial genomes and investigate other anti-fungal processes."

Suspecting that this symbiotic dynamic is far more the rule than the exception, Clardy and Currie are investigating other insect species as well to see how universal this "story arc" is.



This research was funded by the U.S. Department of Agriculture, National Institutes of Health, and the National Science Foundation. The funding and data sources for this study had no role in study design; in the collection, analysis, and interpretation of data; or in the writing of the report.

Written by David Cameron

FINDINGS

There's far more to a pine beetle's back than a hard black shell. Researchers have found that these tiny creatures—responsible for rampant and widespread forest destruction—carry on their backs battling species of fungi, plus a powerful antibiotic molecule that can destroy pathogenic fungi—something that no current medications have achieved.

RELEVANCE

Currently, pathogenic fungal infections are a significant clinical challenge. These findings suggest a potential new source of pharmaceuticals for that purpose. In addition, this study shows how the symbiotic relationships between plants, animals, and microbes are essential for the diversification of life and evolution of organisms.

PRINCIPAL INVESTIGATORS

Jon Clardy, Professor of biological chemistry and molecular pharmacology, Harvard Medical School

http://bcmp.med.harvard.edu/index.php?option=com_akostaff&Itemid=51&func=fullview&staffid=19

Cameron Currie, Associate professor of bacteriology, University of Wisconsin-Madison

http://www.bact.wisc.edu/faculty/currie/

MULTI-MEDIA Slideshow: http://hms.harvard.edu/public/news/jc2008/slideshow.html

JOURNAL Science

FUNDING U.S. Department of Agriculture, National Institutes of Health, and the National Science Foundation

Full citation: Science, October 3, 2008 Vol 322, Issue 5898

"Bacterial Protection of Beetle-Fungus Mutualism"

Jarrod J. Scott(1*), Dong-Chan Oh(2*), M. Cetin Yuceer(3*), Kier D. Klepzig(4), Jon Clardy(2†), Cameron R. Currie(1†)

Researchers reveal Epstein-Barr virus protein contributes to cancer

Researchers at the University of Toronto have shown that the EBNA1 protein of Epstein-Barr virus (EBV) disrupts structures in the nucleus of nasopharyngeal carcinoma (NPC) cells, thereby interfering with cellular processes that normally prevent cancer development. The study, published October 3rd in the open-access journal PLoS Pathogens, describes a novel mechanism by which viral proteins contribute to carcinogenesis.

EBV is a common herpesvirus whose latent infection is strongly associated with several types of cancer, including NPC, a tumor that is endemic in several parts of the world. In NPC, only a few EBV proteins are expressed, among them EBNA1. EBNA1 is required for the persistence of the EBV genomes; however, whether EBNA1 directly contributes to the development of tumors has not been clear, until now.

In this study, Lori Frappier and her team examined PML nuclear bodies and proteins in EBV-positive and EBV-negative NPC cells. Manipulation of EBNA1 levels in each cell type clearly showed that EBNA1 expression induces the loss of PML proteins and PML nuclear bodies through an association of EBNA1 with the PML bodies. PML nuclear bodies are known to have tumor-suppressive effects due to their roles in regulating DNA repair and programmed cell death, and accordingly, EBNA1 was shown to interfere with these processes.

The researchers conclude that there is "an important role for EBNA1 in the development of NPC, in which EBNA1-mediated disruption of PML nuclear bodies promotes the survival of cells with DNA damage." Since EBNA1 is expressed in all EBV-associated tumors, including B-cell lymphomas and gastric carcinoma, these findings raise the possibility that EBNA1 could play a similar role in the development of these cancers. The cellular effects of EBNA1 in other EBV-induced cancers will require further investigation.

http://www.plospathogens.org/doi/ppat.1000170

CITATION: Sivachandran N, Sarkari F, Frappier L (2008) Epstein-Barr Nuclear Antigen 1 Contributes to Nasopharyngeal Carcinoma through Disruption of PML Nuclear Bodies. PLoS Pathog 4(10): e1000170. doi:10.1371/journal.ppat.1000170

Sharpening up Jupiter

New image-correction technique delivers sharpest whole-planet ground-based picture ever


A record two-hour observation of Jupiter using a superior technique to remove atmospheric blur has produced the sharpest whole-planet picture ever taken from the ground. The series of 265 snapshots obtained with the Multi-Conjugate Adaptive Optics Demonstrator (MAD) prototype instrument mounted on ESO's Very Large Telescope (VLT) reveals changes in Jupiter's smog-like haze, probably in response to a planet-wide upheaval more than a year ago.

Being able to correct wide field images for atmospheric distortions has been the dream of scientists and engineers for decades. The new images of Jupiter prove the value of the advanced technology used by MAD, which uses two or more guide stars instead of one as references to remove the blur caused by atmospheric turbulence over a field of view thirty times larger than existing techniques [1].

"This type of adaptive optics has a big advantage for looking at large objects, such as planets, star clusters or nebulae," says lead researcher Franck Marchis, from UC Berkeley and the SETI Institute in Mountain View, California, USA. "While regular adaptive optics provides excellent correction in a small field of view, MAD provides good correction over a larger area of sky. And in fact, were it not for MAD, we would not have been able to perform these amazing observations."

MAD allowed the researchers to observe Jupiter for almost two hours on 16 and 17 August 2008, a record duration, according to the observing team. Conventional adaptive optics systems using a single Jupiter moon as reference cannot monitor Jupiter for so long because the moon moves too far from the planet. The Hubble Space Telescope cannot observe Jupiter continuously for more than about 50 minutes, because its view is regularly blocked by the Earth during Hubble's 96-minute orbit.

Image caption: Jupiter in infrared light, taken on the night of Aug. 17, 2008, with the Multi-Conjugate Adaptive Optics Demonstrator prototype instrument mounted on ESO's Very Large Telescope. This false-colour photo combines a series of images taken over a span of about 20 minutes through three different filters (2, 2.14, and 2.16 microns). The sharpness achieved is about 90 milli-arcseconds across the whole planetary disc, a record for comparable images taken from the ground, and corresponds to seeing details about 300 km wide on the giant planet. The Great Red Spot is not visible in this image, as it was on the other side of the planet during the observations. The observations were made at infrared wavelengths where absorption due to hydrogen and methane is strong, which explains why the colours differ from how we usually see Jupiter in visible light. This absorption means that light can be reflected back only from high-altitude hazes, and not from deeper clouds. These hazes lie in the very stable upper part of Jupiter's troposphere, where pressures are between 0.15 and 0.3 bar. Mixing is weak within this stable region, so tiny haze particles can survive for days to years, depending on their size and fall speed. Additionally, near the planet's poles, a higher stratospheric haze (light blue regions) is generated by interactions with particles trapped in Jupiter's intense magnetic field. Credit: ESO/F. Marchis, M. Wong, E. Marchetti, P. Amico, S. Tordo
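
The caption's numbers are easy to sanity-check. Taking an Earth-Jupiter distance of about 4.5 astronomical units for August 2008 (an assumed round figure for illustration), 90 milli-arcseconds does correspond to roughly 300 km at Jupiter:

import math

AU_KM = 1.495978707e8                    # kilometres per astronomical unit
MAS_TO_RAD = math.radians(1.0) / 3.6e6   # one milli-arcsecond in radians

distance_km = 4.5 * AU_KM                # assumed Earth-Jupiter distance
resolution_mas = 90.0                    # quoted image sharpness
scale_km = resolution_mas * MAS_TO_RAD * distance_km
print(f"{scale_km:.0f} km")              # ~294 km, consistent with the quoted ~300 km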

Using MAD, ESO astronomer Paola Amico, MAD project manager Enrico Marchetti and Sébastien Tordo from the MAD team tracked two of Jupiter's largest moons, Europa and Io – one on each side of the planet – to provide a good correction across the full disc of the planet. "It was the most challenging observation we performed with MAD, because we had to track with high accuracy two moons moving at different speeds, while simultaneously chasing Jupiter," says Marchetti.

With this unique series of images, the team found a major alteration in the brightness of the equatorial haze, which lies in a 16 000-kilometre wide belt over Jupiter's equator [2]. More sunlight reflecting off upper atmospheric haze means that the amount of haze has increased, or that it has moved up to higher altitudes. "The brightest portion had shifted south by more than 6000 kilometres," explains team member Mike Wong.

This conclusion came after comparison with images taken in 2005 by Wong and colleague Imke de Pater using the Hubble Space Telescope. The Hubble images, taken at infrared wavelengths very close to those used for the VLT study, show more haze in the northern half of the bright Equatorial Zone, while the 2008 VLT images show a clear shift to the south.

"The change we see in the haze could be related to big changes in cloud patterns associated with last year's planet-wide upheaval, but we need to look at more data to narrow down precisely when the changes occurred," declares Wong.

A taste for scorpion venom could be cancer's undoing

RADIOACTIVE scorpion venom sounds like the ultimate doomsday weapon but it is now being tested as a treatment for malignant brain cancer.

The scorpion Leiurus quinquestriatus lives in the Middle East. Among the powerful cocktail of neurotoxins packed into its venom is a peptide that is non-toxic to humans and binds to a receptor found only on some tumour cells. In culture, the peptide has invaded tumours in breast, skin, brain and lung tissue, but left healthy cells untouched. "It's as if the tumours collect it," says Michael Egan of the company TransMolecular in Cambridge, Massachusetts. To see if the peptide could deliver lethal doses of radioactivity to cancer cells, researchers at the company have attached radioactive iodine isotopes to it.

In a trial last year, they injected this agent directly into the tumours of 59 people suffering from inoperable brain cancer. All the patients have now died, but those receiving a higher dose lived for three months longer, on average.

In recent weeks, researchers at the University of Chicago in Illinois have begun injecting the agent, known as TM601, into the bloodstream of people with different types of malignant brain cancer. This latest trial will allow the company to test whether TM601 can seek out and kill secondary tumours throughout the body, as well as known primary ones.

Liver transplant recipients almost 3 times more likely to develop cancer

Cancer incidence is higher among liver transplant recipients in Finland compared to the general population, according to a new study in the October issue of Liver Transplantation, a journal published by John Wiley & Sons. The article is also available online at Wiley Interscience (www.interscience.wiley.com).

Transplantation, and the subsequent immunosuppression that keeps rejection at bay, have long been associated with increased cancer risk. Several studies have examined the issue, but few have used a control population for comparison, and many rely on limited data. More studies are needed to reliably reveal the cancer risk pattern after transplantation, so that doctors can optimize immunosuppression, cancer surveillance and risk management.

Researchers, led by Helena Isoniemi of Finland, sought to describe the cancer risk pattern in Finnish liver transplant patients, hypothesizing that the incidence of specific types of cancer would be higher among the recipients. They included all liver transplant patients from Helsinki University Central Hospital transplanted between 1982 and 2005. Using the Finnish Population Register and the national Cancer Registry, they were able to follow up each patient from the date of transplant through the end of 2005.

Among the 540 liver transplant recipients, they found a total of 39 post-transplant de novo cancers in 36 patients. The overall standardized incidence ratio (SIR) compared to the general population was 2.59. Non-Hodgkin lymphoma, non-melanoma skin cancer and basal cell carcinoma had significantly elevated SIRs.
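
A standardized incidence ratio is the number of cancers observed in the cohort divided by the number expected from general-population rates over the cohort's person-years at risk. The sketch below illustrates the arithmetic with made-up strata; the study's actual age- and sex-stratified rates are not reproduced here.

def standardized_incidence_ratio(observed, strata):
    """SIR = observed cases / expected cases.

    strata: iterable of (person_years, population_rate_per_person_year),
    one entry per age/sex stratum (values below are illustrative only).
    """
    expected = sum(person_years * rate for person_years, rate in strata)
    return observed / expected

# Example: 39 observed cancers against ~15 expected gives SIR ~ 2.6,
# in line with the 2.59 reported for the Finnish cohort.
print(round(standardized_incidence_ratio(39, [(3000, 0.002), (1500, 0.006)]), 2))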

"The most common cancer types in our cohort were lymphoma and skin cancer," the authors report. "Non-Hodgkin lymphoma, which included four cases of post-transplant lymphoproliferative disorder, occurred more frequently in males, in patients transplanted at a younger age and soon after transplantation." By contrast, non-melanoma skin cancer was more common among older patients and those who had antibody induction therapy. Interestingly, the authors found lower cancer incidence among patients with history of acute rejections, correlating most strongly with lymphomas.

"Based on our data, one out of six liver transplant patients is estimated to develop some form of cancer by 20 years after transplantation." The authors report. "This study points out the importance of cancer surveillance after liver transplantation."

An accompanying editorial by Ashokkumar Jain of the University of Rochester et al. reviews the Aberg et al. findings alongside the rest of the literature, looking closely at patient age and duration of follow-up. Aberg and colleagues "show that the cumulative incidence of de novo cancers increased at 1, 5, 10, and 20 years of follow-up to 3 percent, 5 percent, 13 percent and 16 percent, respectively," Jain writes.

He also pointed out that other reports have noted a significantly increased risk of de novo oropharyngeal and lung cancers among liver transplant patients who smoke, a potentially preventable risk factor.

Throughout the literature, Jain and his coauthors found wide variation in the reported incidence of post-transplant cancers, partly related to the length of follow-up and partly related to the inclusion or exclusion of lymphoid lesions.

"The overall rate of de novo solid tumors increased with age at the time of transplant and the length of follow up; while the rate of post-transplant lympho-proliferative disorders decreased with age at the liver transplant, with a higher incidence in the first few years," they conclude.



Rethinking Who Should Be Considered 'Essential' During a Pandemic Flu Outbreak

Not only are doctors, nurses, and firefighters essential during a severe pandemic influenza outbreak. So, too, are truck drivers, communications personnel, and utility workers. That's the conclusion of a Johns Hopkins University article to be published in the journal Biosecurity and Bioterrorism. The report, led by Nancy Kass, Sc.D., Deputy Director of Public Health for the Johns Hopkins Berman Institute of Bioethics, provides ethical guidance for pandemic planning that ensures a skeletal infrastructure remains intact at all times. Dr. Kass says, “When preparing for a severe pandemic flu, it is crucial for leaders to recognize that if the public has limited or no access to food, water, sewage systems, fuel and communications, the secondary consequences may cause greater sickness, death and social breakdown than the virus itself.”

The authors represent a wide range of expertise in several areas of pandemic emergency planning at both the state and federal levels. After examining several accepted public health rationing strategies that give priority to all healthcare workers and those most susceptible to illness, the authors propose a new strategy that gives priority to a more diverse group. “Alongside healthcare workers and first responders, priority should be given to the people who provide the public with basic essentials for good health and well-being, ranging from grocery store employees and communications personnel to truck drivers and utility workers,” says Dr. Kass.

The report recognizes that given the widespread and sustained nature of a pandemic, federal assistance will be spread thin and local jurisdictions must develop their own preparedness plans to ensure they are capable of sustained self-sufficiency. Encouraging and working with local businesses to develop their own response plans can help reduce the burden on local governments during a pandemic. Similarly, individuals and families who can afford it should do their best to prepare for any disaster. The paper notes that the more initiative the general public exercises in stockpiling several weeks' worth of food, water, paper goods, batteries, medicines, and other needed supplies, the less vulnerable they will be to a break in the supply chain. In fact, the report emphasizes, it is important for leaders to communicate to the middle class and the wealthy that it is their responsibility to prepare for self-sufficiency in order to free up scarce supplies and allow first responders to direct their attention towards those too poor or vulnerable to prepare themselves.



The article lays out a set of ethics rules and principles to help guide and frame a pandemic response strategy that is evidence-based, transparent, fair, and recognizes the burdens the public may face. Dr. Kass points out that “considerations of ethics are critical not only in having respectful and inclusive discussion and engaging with the public fairly, but also in improving the likelihood of public health and medical success through increased cooperation and understanding of government plans.”

Other authors of this paper include: Jean Otto, DrPH, Senior Epidemiologist, Department of Defense, Global Emerging Infections Surveillance and Response System, Armed Forces Health Surveillance Center, Walter Reed Army Institute of Research; Daniel O'Brien, JD, Principal Counsel, Office of the Maryland Attorney General, Department of Health and Mental Hygiene; and Mathew Minson, MD, Senior Medical Officer for Strategic Initiatives, Office of the Assistant Secretary for Preparedness and Response, U.S. Department of Health and Human Services.

'Little bang' triggered solar system formation

Washington, D.C.—For several decades, scientists have thought that the Solar System formed as a result of a shock wave from an exploding star—a supernova—that triggered the collapse of a dense, dusty gas cloud that contracted to form the Sun and the planets. But detailed models of this formation process have only worked under the simplifying assumption that the temperatures during the violent events remained constant. Now, astrophysicists at the Carnegie Institution's Department of Terrestrial Magnetism (DTM) have shown for the first time that a supernova could indeed have triggered the Solar System's formation under the more likely conditions of rapid heating and cooling. The results, published in the October 20, 2008, issue of the Astrophysical Journal, have resolved this long-standing debate.

"We've had chemical evidence from meteorites that points to a supernova triggering our Solar System's formation since the 1970s," remarked lead author, Carnegie's Alan Boss. "But the devil has been in the details. Until this study, scientists have not been able to work out a self-consistent scenario, where collapse is triggered at the same time that newly created isotopes from the supernova are injected into the collapsing cloud."

Short-lived radioactive isotopes—versions of elements with the same number of protons, but a different number of neutrons—found in very old meteorites decay on time scales of millions of years and turn into different (so-called daughter) elements. Finding the daughter elements in primitive meteorites implies that the parent short-lived radioisotopes must have been created only a million or so years before the meteorites themselves were formed. "One of these parent isotopes, iron-60, can be made in significant amounts only in the potent nuclear furnaces of massive or evolved stars," explained Boss. "Iron-60 decays into nickel-60, and nickel-60 has been found in primitive meteorites. So we've known where and when the parent isotope was made, but not how it got here."
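
The timing argument rests on simple decay arithmetic: the surviving fraction of a radioisotope after time t is 2^(-t / t_half). The sketch below assumes a half-life for iron-60 of about 2.6 million years; published measurements have ranged from roughly 1.5 to 2.6 million years, so treat the exact numbers as illustrative.

def surviving_fraction(t_myr: float, t_half_myr: float = 2.6) -> float:
    """Fraction of an isotope remaining after t_myr million years."""
    return 2.0 ** (-t_myr / t_half_myr)

# A million years between the supernova and meteorite formation still
# leaves most of the iron-60 intact to decay into detectable nickel-60...
print(surviving_fraction(1.0))    # ~0.77
# ...whereas after 100 million years essentially none would remain.
print(surviving_fraction(100.0))  # ~3e-12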



Image caption: Cross-sectional view of one-half of a solar-mass target cloud being struck by a supernova shock front that is traveling downward. The colors represent the target cloud, with redder colors representing denser regions. The solid black contours delineate material that was originally in the supernova shock front, where short-lived radioisotopes are being injected into the collapsing target cloud.

Previous models by Boss and former DTM Fellow Prudence Foster showed that the isotopes could be deposited into a pre-solar cloud if a shock wave from a supernova explosion slowed to 6 to 25 miles per second and the wave and cloud had a constant temperature of -440 °F (10 K). "Those models didn't work if the material was heated by compression and cooled by radiation, and this conundrum has left serious doubts in the community about whether a supernova shock started these events over four billion years ago or not," remarked Harri Vanhala, who found the negative result in his Ph.D. thesis work at the Harvard-Smithsonian Center for Astrophysics in 1997.

Using an adaptive mesh refinement hydrodynamics code, FLASH2.5, designed to handle shock fronts, as well as an improved cooling law, the Carnegie researchers considered several different situations. In all of the models, the shock front struck a pre-solar cloud with the mass of our Sun, consisting of dust, water, carbon monoxide, and molecular hydrogen, reaching temperatures as high as 1,340°F (1000 K). In the absence of cooling, the cloud could not collapse. However, with the new cooling law, they found that after 100,000 years the pre-solar cloud was 1,000 times denser than before, and that heat from the shock front was rapidly lost, resulting in only a thin layer with temperatures close to 1,340°F (1000 K). After 160,000 years, the cloud center had collapsed to become a million times denser, forming the protosun. The researchers found that isotopes from the shock front were mixed into the protosun in a manner consistent with their origin in a supernova.
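
Why cooling matters can be seen in a toy energy balance: compression at the shock heats the gas while radiation drains the heat away, so once heating stops the gas relaxes back toward the cold cloud temperature and only a thin layer stays hot. The rates and power-law cooling function below are illustrative stand-ins, not the physics in FLASH2.5:

def evolve_temperature(t_gas=1000.0, heating=0.0, cool_coeff=1e-4,
                       dt=1.0, steps=100_000):
    """Toy post-shock energy balance (illustrative units throughout)."""
    for _ in range(steps):
        cooling = cool_coeff * t_gas ** 1.5   # assumed power-law cooling rate
        t_gas += (heating - cooling) * dt
        t_gas = max(t_gas, 10.0)              # floor at the ~10 K cloud temperature
    return t_gas

# Shock-heated gas at ~1000 K cools back to the ~10 K cloud temperature
# once compression ceases, which is why only a thin layer remains hot.
print(evolve_temperature())  # -> 10.0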

"This is the first time a detailed model for a supernova triggering the formation of our solar system has been shown to work,'' said Boss. "We started with a Little Bang 9 billion years after the Big Bang."

This research was supported in part by the NASA Origins of Solar Systems and Planetary Geology and Geophysics Programs and in part by the NASA Astrobiology Institute. The software used in this work was in part developed by the DOE-supported ASC/Alliances Center for Astrophysical Thermonuclear Flashes at the University of Chicago.

Second lumpectomy for breast cancer reduces survival rates

UC Davis researchers find disturbing trend in treating recurrent breast cancer

SACRAMENTO, Calif. - A majority of women with breast cancer today are candidates for lumpectomy, allowing for conservation of most of their breast tissue. Results of a UC Davis study, however, show that nearly a quarter of women whose cancer recurs in the same breast are treated with a second lumpectomy rather than a mastectomy, defying current treatment recommendations and cutting the number of years those women survive in half.

"We were surprised to find that so many women in our study — almost a quarter of them — had received another lumpectomy rather than a mastectomy," said Steven Chen, a UC Davis Cancer Center surgical oncologist and lead author of the study, which appears in the October issue of the American Journal of Surgery. "It's likely that patients are asking for lumpectomies when their cancer is diagnosed a second time, and their doctors are simply complying with that request. Whatever the reason, that decision can shorten life spans."

Chen and study co-author, Steve Martinez, also a UC Davis Cancer Center surgical oncologist, gathered data from the National Cancer Institute's Surveillance, Epidemiology and End Results database, which includes information on all cancers diagnosed in selected regions throughout the nation. Their study included 747 patients who previously received breast-conservation therapy and were diagnosed with cancer a second time in the same breast between 1988 and 2004.

The authors found that women who had mastectomies had a 78 percent survival rate after five years, while those who had second lumpectomies had a 67 percent survival rate. The 10-year survival rates were 62 percent for those who had mastectomies and 57 percent for those who had second lumpectomies. In all, 24 percent of women with recurrent breast cancer in the same breast had second lumpectomies.

The researchers went on to calculate the risk of dying for mastectomy patients compared to lumpectomy patients. They found that, after adjusting for factors that affect survival, there would be half as many survivors at any given time in the lumpectomy group as in the mastectomy group.
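
One way to formalize "half as many survivors at any given time" is a proportional-hazards reading with a hazard ratio near 2, under which one group's survival curve is the square of the other's. This is an interpretive sketch of that arithmetic, not the study's reported model:

def survival_under_hazard_ratio(s_reference: float, hr: float = 2.0) -> float:
    """Survival in a group whose hazard is hr times the reference group's.

    Under proportional hazards, S_group(t) = S_reference(t) ** hr.
    The hazard ratio of 2.0 is an assumed reading of the press release.
    """
    return s_reference ** hr

# Unadjusted five-year figures from the study were 78% vs 67%; a doubled
# hazard would predict 0.78 ** 2 ~ 0.61, so the "half as many survivors"
# statement reflects the covariate-adjusted comparison, not the raw rates.
print(survival_under_hazard_ratio(0.78))  # ~0.61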

Chen explained that a mastectomy is the generally accepted surgical treatment for a second cancer because whole breast radiation, which typically accompanies a lumpectomy, is not usually recommended twice in a lifetime. This new study shows as well that there is a survival advantage to those who choose a mastectomy.

According to Martinez, knowledge of breast cancer and its treatments is continuously advancing, and second lumpectomies could at some point become a viable option.

"As therapy for breast cancer becomes more targeted and researchers come closer to identifying those factors that make some breast cancers more aggressive than others, we may have the option of recommending second and even third lumpectomies in select cases in the future. Until then, mastectomy remains the best option for women experiencing a same-breast recurrence of their breast cancer," he said.

Breast cancer is currently the most common newly diagnosed malignancy among American women. The chance of developing invasive breast cancer at some time in a woman's life is about 1 in 8. In the United States in 2008, an estimated 182,460 new cases of invasive breast cancer will be diagnosed, an additional 67,770 new cases of carcinoma in situ — or "pre-cancer" — will be discovered and 40,480 women will die from breast cancer.



Musicians use both sides of their brains more frequently than average people

NASHVILLE, Tenn.--Supporting what many of us who are not musically talented have often felt, new research reveals that trained musicians really do think differently than the rest of us. Vanderbilt University psychologists have found that professionally trained musicians more effectively use a creative technique called divergent thinking, and also use both the left and the right sides of their frontal cortex more heavily than the average person.

The research by Crystal Gibson, Bradley Folley and Sohee Park is currently in press at the journal Brain and Cognition.

"We were interested in how individuals who are naturally creative look at problems that are best solved by thinking 'out of the box'," Folley said. "We studied musicians because creative thinking is part of their daily experience, and we found that there were qualitative differences in the types of answers they gave to problems and in their associated brain activity."

One possible explanation the researchers offer for the musicians' elevated use of both brain hemispheres is that many musicians must be able to use both hands independently to play their instruments.

"Musicians may be particularly good at efficiently accessing and integrating competing information from both hemispheres," Folley said. "Instrumental musicians often integrate different melodic lines with both hands into a single musical piece, and they have to be very good at simultaneously reading the musical symbols, which are like left-hemisphere-based language, and integrating the written music with their own interpretation, which has been linked to the right hemisphere."

Previous studies of creativity have focused on divergent thinking, which is the ability to come up with new solutions to open-ended, multifaceted problems. Highly creative individuals often display more divergent thinking than their less creative counterparts.

To conduct the study, the researchers recruited 20 classical music students from the Vanderbilt Blair School of Music and 20 non-musicians from a Vanderbilt introductory psychology course. The musicians each had at least eight years of training. The instruments they played included piano, woodwind, string and percussion instruments. The groups were matched on age, gender, education, high school grades and SAT scores.

The researchers conducted two experiments to compare the creative thinking processes of the musicians and the control subjects. In the first experiment, the researchers showed the research subjects a variety of household objects and asked them to make up new functions for them, and also gave them a written word association test. The musicians gave more correct responses than non-musicians on the word association test, which the researchers believe may be attributed to enhanced verbal ability among musicians. The musicians also suggested more novel uses for the household objects than their non-musical counterparts.

In the second experiment, the two groups again were asked to identify new uses for everyday objects as well as to perform a basic control task while the activity in their prefrontal lobes was monitored using a brain scanning technique called near-infrared spectroscopy, or NIRS. NIRS measures changes in blood oxygenation in the cortex while an individual is performing a cognitive task.
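
Under the hood, NIRS converts changes in light attenuation at two or more wavelengths into oxy- and deoxy-hemoglobin concentration changes via the modified Beer-Lambert law. The sketch below shows the textbook two-wavelength version with illustrative coefficients; it is not the processing pipeline used in this study:

import numpy as np

def hemoglobin_changes(delta_od, ext, separation_cm=3.0, dpf=6.0):
    """Modified Beer-Lambert law for two-wavelength NIRS.

    delta_od : attenuation (optical density) changes at two wavelengths
    ext      : 2x2 extinction coefficients, rows = wavelengths,
               columns = (HbO2, HbR); values used here are illustrative
    Solves delta_od = (ext * pathlength) @ delta_conc for delta_conc.
    """
    pathlength = separation_cm * dpf  # source-detector distance x pathlength factor
    return np.linalg.solve(np.asarray(ext) * pathlength, np.asarray(delta_od))

# Usage with made-up numbers: returns (delta HbO2, delta HbR).
d_hbo2, d_hbr = hemoglobin_changes([0.012, 0.009], [[0.9, 1.8], [1.2, 0.8]])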

"When we measured subjects' prefrontal cortical activity while completing the alternate uses task, we found that trained musicians had greater activity in both sides of their frontal lobes. Because we equated musicians and non-musicians in terms of their performance, this finding was not simply due to the musicians inventing more uses; there seems to be a qualitative difference in how they think about this information," Folley said.

The researchers also found that, overall, the musicians had higher IQ scores than the non-musicians, supporting recent studies linking intensive musical training to elevated IQ scores.

The research was partially supported by a Vanderbilt University Discovery Grant.

Why your boss is white, middle-class and a show-off

The way male managers power dress, posture and exercise power is due to humans' evolutionary biology, according to research from the University of New South Wales (UNSW).

Prehistoric behaviours, such as male domination, protection of what is perceived as their "turf" and ostracism of those who do not agree with the group, are more commonplace in everyday work situations than many of us want to accept, according to the research, which was carried out in hospitals.

"This tribal culture is similar to what we would have seen in hunter gather bands on the savannah in southern Africa," says the author of the paper, Professor Jeffrey Braithwaite, from UNSW's Institute for Health Innovation.

"While this research focuses specifically on health care settings, the results can be extrapolated to other workplaces," says Professor Braithwaite.

"Groups were territorial in the past because it helped them survive. If you weren't in a tight band, you didn't get to pass on your genes," he says. "Such tribalism is not necessary in the same way now, yet we still have those characteristics because they have evolved over two million years.

"It's a surprise just how hard-wired this behaviour is," says Professor Braithwaite. "It's predictable that a group will ostracise a whistleblower, for instance. It's not good, but it's understandable in the tribal framework. It explains all sorts of undesirable behaviours, including bullying."

Professor Braithwaite's research is based on hundreds of interviews and observations of health workers over a 15-year period. He used an evolutionary psychology approach – incorporating archaeology and anthropology of the earliest known humans – to compare with modern behaviours.

It is hoped the research can be used to develop strategies to encourage clinical professionals to work together more effectively.

"We need to stop being simplistic and realise that changing behaviours and encouraging teamwork is much harder than we think," says Professor Braithwaite. "Getting different groups together and talking through some of the differences, and appreciating some of the unwritten rules which drive people, are crucial steps in improving trust.

"We also need to re-think education. We train doctors in a completely different arena from nurses and allied health staff, then we bring them together in the workplace after they graduate and expect everyone to be team players," he says. "We need to bring them together much earlier in the educational process."

Other features include:

* Meetings are held in the office of the most senior manager, who typically dominates proceedings

* Managers do not spend as much of their time as people think quietly reading or attending to paperwork in front of a computer; they are out manoeuvring and positioning at meetings, one-on-one encounters and coffee cliques

* Managers rarely take lunch or tea breaks

* Non-managerial staff regularly take an allocated period of time for breaks

The paper has just been published in the Journal of Health Organisation and Management.



'Coca-Cola douches' scoop Ig Nobel prize

00:30 03 October 2008, NewScientist.com news service, Jeff Hecht

Tests of whether sodas such as Coke and Pepsi could be used as spermicides were among the many offbeat ideas celebrated at the 2008 Ig Nobel awards on Thursday. Lap dancers' tips and armadillos' uncanny ability to wreak havoc at archaeological sites were also the subjects of prize-winning studies.

The tongue-in-cheek awards, presented at Harvard University, are organised by the humorous scientific journal the Annals of Improbable Research for research achievements "that make people laugh – then think".

Deborah Anderson of Harvard Medical School's birth-control laboratory took her first step towards the Ig Nobel chemistry prize in the 1980s when she asked medical student Sharee Umpierre what type of contraception had been used at the all-girl Catholic boarding school she had attended in Puerto Rico.

"Coca-Cola douches," Umpierre replied. Though that was the first Anderson had heard of the idea, her gynaecologist colleague, Joe Hill, remembered a song of the same name by an outrageous 1960s band called The Fugs.

"Coca-Cola douches had become a part of contraceptive folklore during the 1950s and 1960s, when other birth-control methods were hard to come by," Anderson told New Scientist. "It was believed that the carbonic acid in Coke killed sperm, and the method came with its own 'shake and shoot applicator'" – the classic Coke bottle.



