13 April 2007

Axis introduces compact video encoder with Power over Ethernet for discreet video surveillance


LUND, Sweden – (March 26, 2007) – Axis Communications today introduced the ultra-compact AXIS 247S Video Server, designed for discreet or space-restricted video surveillance applications and providing added value for locations such as stores, banks, and government buildings. The high-performance AXIS 247S makes installations easy since it receives power through the same Ethernet cable that carries data and can further supply power to the attached analog camera, thus eliminating the need for power outlets.

“The AXIS 247S provides a compact and flexible solution that integrates a miniature or standard analog camera into a high performance, IP-based video surveillance system,” said Anders Laurin, Axis’ executive vice president of corporate strategy. “This enables analog camera users to take advantage of all the benefits that digital technology offers.”

The AXIS 247S converts an analog video stream into high-quality, full frame rate digital video. The video encoder can deliver Motion JPEG and advanced MPEG-4 streams simultaneously, providing the ability to optimize for both image quality and bandwidth. It also provides video motion detection, one-way audio for listening in on an area, and an audio detection alarm.

Given the encoder’s small size and support for Power over Ethernet (PoE), relocating and setting up surveillance is quick and easy. Positioning the AXIS 247S close to an analog camera also eliminates the loss in image quality that would occur if video were to be transferred over long distances through a coaxial cable instead of an IP network.

Axis video encoders offer the market's most comprehensive set of network capabilities, optimizing the network video solution for security, efficiency and manageability. An example is the IPv6 support, which guards against the growing shortage of IPv4 addresses and eliminates problems with static IP address allocation.

The AXIS 247S is supported by the industry’s largest base of application software through the Axis Application Development Partner program, as well as the AXIS Camera Station video management software. The product is available through Axis’ distribution channels in early Q2 2007.

About Axis
Axis is an IT company offering network video solutions for professional installations. The company is the global market leader in network video, driving the ongoing shift from analog to digital video surveillance. Axis products and solutions focus on security surveillance and remote monitoring, and are based on innovative, open technology platforms.

Axis is a Swedish-based company, operating worldwide with offices in 18 countries and cooperating with partners in more than 70 countries. Founded in 1984, Axis is listed on the Nordic List, Mid Cap and Information Technology exchanges. For more information about Axis, please visit our website at www.axis.com.

31 March 2007

The Beats Go On: iPods

Two years ago, Dr. Michael Barrett had a cool idea for taking his Temple University medical school classes into the high-tech future—or so he thought. He'd been teaching students to recognize the distinctive sounds of heart murmurs by playing recordings in class. "We'd give a one-hour lecture, play each sound for them, and say, 'That's your murmur, guys,'" he recalls. "Then they'd look at us blankly and file out of the room." Barrett's idea was to give his students CDs loaded with heart sounds to listen to at home. Alas, this proposal garnered equally blank looks from his students. "I was surprised," he says, "but they were like, 'Dr. Barrett, nobody listens to CDs anymore.'"
Barrett didn't own an iPod at the time, and he says that "90 percent of practicing doctors over 30 probably still don't." But his students do, and today they have beat-heavy tracks like "innocent systolic murmur," "aortic regurgitation" and "mitral stenosis" at the top of their playlists. They've put the so-called "heart songs" on repeat and listened to them at the gym, on their commutes, while walking around town. In the process, apparently, they've become better doctors.
This weekend at the annual conference of the American College of Cardiology, Barrett reported that cardiology (or at least one component of it) can be taught by iPod. After listening to MP3s of 400 or more heartbeats, manipulated to sound either healthy or abnormal, students were able to easily identify sounds in patients that might signal trouble. Before testing out Barrett's program, another group of internists managed to identify only 40 percent of murmurs correctly; afterwards, they got 80 percent right. "Usually, the first time you try to listen for a heart murmur you're lucky if you hear a heartbeat at all," says Jodi Washinsky, a fourth-year med student at Tufts. "But this has really helped me. I've found myself picking up a lot of murmurs in patients, now that I actually know what I'm listening for."
What's amazing isn't necessarily that doctors are using iPods as teaching tools—it's that they've taken so long to catch on. In 2005, Duke University gave all incoming undergraduates their own iPods, and many other schools have signed up for Apple's "iTunes U," which allows them to download podcasts of lectures and other course materials. But medicine, particularly cardiology, has lagged behind, says Barrett. "We as cardiologists haven't done a very good job of teaching these heart sounds to medical students and even residents," he says. "In the past, we would have just played short recordings in class, or we might have waited for a patient to show up with a perfect murmur and then we'd line all the students up in front of him. If they were lucky, they'd get to listen for 30 seconds each."
That approach just doesn't work, says Barrett, because the key to learning heart murmurs is repetition. "Learning these sounds is not an intellectual skill. It's a technical skill, like tying knots," he says. "And the way you learn a technical skill is by repeating it, and repeating it, and repeating it." Barrett surveyed students and found that 400 repetitions of a heartbeat were enough to drill into a student's head the particular rhythm of a murmur: the "lub-dub" of a healthy heart, the "lub-whoosh" of aortic regurgitation, or the "lub-rumble-dub" of aortic stenosis. But 400 heartbeats were too much to ask of a real, live patient, says Barrett: "They won't tolerate sitting there that long."
Instead, Barrett put together computer-generated simulations of those noises, which were clearer and easier to hear than the heartbeat of an actual patient. Then he formatted them into individual MP3 files, introducing and explaining each one. The recordings were an immediate hit with students. "Time is obviously a problem for medical students, so anything you can learn outside the hospital is great," says Washinsky, who will enter an internal medicine residency next year. "I have a half-hour commute, so I was listening to this a lot on the train, and at home too, if I was going to bed and didn't feel like watching TV."
Not only were the "heart songs" convenient, they worked. Barrett has mostly given them to younger docs and students, but he says older physicians can also benefit from adding the MP3s to their music libraries. He'll be testing out willing participants' abilities at the conference this weekend, and for those who don't make it there, the ACC offers a collection of recordings for download on its Web page. (For those who still haven't broken down and bought iPods, the recordings also come as Barrett originally envisioned them, in CD form.) Even those of us without stethoscopes can learn a heart song, well, by heart, says Barrett. "I had a feeling this was a skill you could pick up without any medical background at all," he says, "so I secretly put a marketing major in our classes and had him learn these heartbeats on his iPod. At the end of the class he scored equal to any of the med students. This kid couldn't spell 'mitral regurgitation,' but he could learn to recognize it."

Apple iTunes offers 'Complete My Album'

SAN JOSE, Calif. - Apple Inc., the company that popularized selling songs online for 99 cents apiece, now hopes to buoy interest in albums, giving customers credit for purchases of full albums from which they have bought individual tracks.
Apple introduced the "Complete My Album" feature Thursday on its iTunes Store. It now gives a full credit of 99 cents for every track the user previously purchased and applies it toward the purchase of the complete album.
For instance, most albums on iTunes cost $9.99, so a customer who has already bought three tracks can download the rest of the album for $7.02.
Previously, users who bought singles and later opted to buy the album had to pay the full price of the album and ended up with duplicates of those songs.
The album price reduction is good for only 180 days after the initial purchase of individual tracks.
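The credit arithmetic described above is straightforward; a minimal sketch (prices are illustrative, and the flat 99-cent track price is the one quoted in the article, not every case in the real store):

```python
# Sketch of the "Complete My Album" credit arithmetic described above.
# Assumes a flat 99-cent track price, as quoted in the article.

TRACK_PRICE = 0.99

def complete_album_price(album_price, tracks_already_bought):
    """Price to finish an album after crediting previously bought tracks."""
    credit = tracks_already_bought * TRACK_PRICE
    return round(album_price - credit, 2)

# A $9.99 album after buying three 99-cent singles:
print(complete_album_price(9.99, 3))  # 7.02
```

This matches the example in the article: three tracks at 99 cents yield a $2.97 credit against the $9.99 album price, leaving $7.02.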
Eddy Cue, Apple's vice president of iTunes, said the new feature should help eliminate the resistance that customers, including himself, may have felt in buying an album after they had already bought a single from it.
"Once we bought a song, we wondered why we had to buy it again if we wanted the album," Cue said. "We hope it helps us sell more songs ultimately, and from the customer's point of view, we think it's the right thing to do."
About 45 percent of the nearly 2.5 billion songs sold on iTunes were purchased as albums, Cue said.
For a limited period of 90 days, Apple said it will make the "Complete My Album" offer retroactive to users who purchased tracks dating back to the launch of the iTunes Store four years ago.
Apple dominates the online music market and ranks among the leading music retailers worldwide, behind only Wal-Mart Stores Inc., Best Buy Co., and Target Corp.

Microsoft unveils new mobile Web browser

REDMOND, Wash. - Microsoft Corp. has unveiled an early version of a new Web browser for mobile devices, called Deepfish, that it said will make browsing full-sized Web pages faster and easier on small smart-phone screens.

To date, most Web browsers for mobile phones work best with pared-down versions of existing sites, limiting mobile users' access to the Internet to a sliver of what's available to desktop Web surfers.
Microsoft said on the Live Labs Web site that the Deepfish technology is in very early stages, and "still a few releases from beta quality."
Deepfish's unveiling Wednesday followed another mobile Web browser development from Microsoft. ZenZui, a startup that used technology developed at Microsoft's research lab, on Monday showed off a visual way to store and navigate bookmarked Web pages on a mobile phone.
For now, a limited number of users with smart phones or Pocket PCs running Windows Mobile 5.0 or later can download Deepfish from the Live Labs Web site.

25 March 2007

Protein Factory Reveals Its Secrets (Part 4 End)


A similar proton-shuttle mechanism had been proposed earlier by professor of theoretical chemistry Johan Åqvist of Uppsala University, in Sweden, and coworkers. They based their proposal on molecular dynamics and combined quantum mechanical/molecular mechanics simulations. Similar simulations by chemistry professor Arieh Warshel and coworkers at the University of Southern California, Los Angeles, support the proton-shuttle mechanism as well, although the USC group found that electrostatic stabilization, not substrate assistance or orientational entropy, accounts for most of the catalytic effect.
Computation and simulation are likely to be extraordinarily useful for further clarifying the ribosome's mechanism, because these techniques currently represent "the only way to study the ribosome in motion in atomic detail," says theoretical biologist Kevin Sanbonmatsu of Los Alamos National Laboratory. Sanbonmatsu and coworkers have used a supercomputer to simulate a working ribosome, identifying eight new potential antibiotic target sites. "The study is the largest simulation performed to date in biology," Sanbonmatsu says.
Last year, Yonath, chemistry professor Lou Massa of Hunter College of the City University of New York, crystallographer Jerome Karle of the Naval Research Laboratory, Washington, D.C., and coworkers turned to density functional theory to model ribosome catalysis. They reported a quantum mechanical transition state for peptide bond formation in the ribosome. The study also "defined the activation energy of the reaction and identified ribosomal interactions that seem to stabilize the transition state, which is formed while the A-site tRNA is rotating into the P site," Yonath says.
Such revelations notwithstanding, the ribosome continues to hold onto a few secrets. "There have always been researchers who think that we understand how the ribosome works," Noller says. However, "at this point, in spite of high-resolution crystal structures and decades of biochemical, genetic, and biophysical studies, I don't think we understand the fundamental mechanisms at all," he says.
"How do tRNAs and mRNA move during translocation, a process that involves molecular movements of many tens of angstroms every 50 milliseconds or so?" Noller asks. "What is the role of EF-G in that process? How does EF-Tu speed up binding of aminoacyl-tRNA to the A site by several thousandfold? The ribosome is enormous and tremendously conserved phylogenetically, yet the things we claim to understand at this point involve only a handful of nucleotides."
"The questions are getting finer, and they're also getting harder to ask," Strobel notes. "Where one person says we have the answers, the next person says we have the questions."

Protein Factory Reveals Its Secrets (Part 3)

Also in 2000, Yonath and coworkers obtained a structure of the small subunit and analyzed its structure with each of four antibiotics bound. The study revealed the antibiotics' binding sites and enabled the researchers to propose modes of action for these drugs. The team also obtained a structure of the large ribosomal subunit.
About the same time, structural biologist Venki Ramakrishnan of the MRC Laboratory of Molecular Biology at the University of Cambridge and coworkers obtained a crystal structure of the small subunit and also determined its structure with different antibiotics bound to it. Their study focused particularly on quality control of the decoding process, the way the ribosome checks codon-anticodon interactions between mRNA and newly arrived tRNAs.
The ribosome is strict about correct base pairing between the first two positions of three-nucleotide codons and anticodons. But it is more tolerant at the third; in fact, a handful of different synonymous codons (C&EN, Jan. 22, page 38) that differ only in their third positions can encode a single amino acid. The study revealed the structural basis for this redundancy in the genetic code. "We showed how the ribosome can discriminate between correct and incorrect tRNAs," Ramakrishnan says.
In 2001, a 5.5-Å resolution map of a whole ribosome with mRNA bound and tRNAs in the A, P, and E sites was obtained by Noller; Jamie H. D. Cate, now associate professor of chemistry, biochemistry, and molecular biology at UC Berkeley; Marat Yusupov of the Structural Biology & Genomics Laboratory, Strasbourg, France; and coworkers. The work revealed more about the relative orientation of the two subunits and their interactions with tRNAs. More recently, Cate and coworkers independently published a map of the whole ribosome, and Yusupov and coworkers independently obtained the structure of the ribosome with mRNA bound.
Last year, three more structures of the whole ribosome appeared. Ramakrishnan and coworkers obtained a 2.8-Å structure of the ribosome with mRNA and tRNAs bound. The study revealed that a kink in mRNA between the A and P sites is probably essential for maintaining the correct mRNA reading frame during translation. Noller and coworkers mapped the whole ribosome at 3.7-Å resolution with an mRNA mimic and two tRNAs in place. And Cate and coworkers obtained a 3.5-Å structure of the ribosome with mRNA and a tRNA mimic attached.
Such structural studies have opened the floodgates for a range of biochemical and computational research on the mechanism of action of the ribosome. "Lots of tidbits about the ribosome were out there already, but the structural work is what's 'crystallized' it all," says Rachel Green, professor of molecular biology and genetics at Johns Hopkins School of Medicine. "It's led to all the biochemistry that our group and several others have done on the ribosome."
For example, the idea that a specific RNA nucleotide in the ribosome active site accelerates peptidyl transfer by acting as a base was proposed by Steitz, Moore, molecular biophysics and biochemistry professor Scott A. Strobel, and coworkers at Yale. But results of mutational studies by Rodnina and coworkers and by Green's group contradicted that idea.
So did a subsequent study in which the entropy and enthalpy of the ribosome reaction were assessed by Rodnina; Richard V. Wolfenden, professor of chemistry, biochemistry, and biophysics at the University of North Carolina, Chapel Hill; and coworkers. If the peptide transfer reaction were base-catalyzed, it would be expected to have a large enthalpic component. But Rodnina, Wolfenden, and coworkers found that the origin of the 10^7-fold rate enhancement produced by the ribosome is entirely entropic and due to juxtaposition or desolvation of the substrates, not to base catalysis.
"The view accepted by most people now is that the active-site nucleotide does not play a dramatic role in peptidyl transfer," Green says. The general consensus, she says, is that orientation and positioning of substrates by the ribosome structure accelerates ribosome catalysis much more than any specific chemical effect.
Although the ribosome per se may not promote peptidyl transfer in a very active chemical manner, it's possible that the P-site tRNA substrate does catalyze the reaction, a proposed case of substrate-assisted catalysis. The growing protein chain is attached by an ester to a 3′-hydroxyl on one of the P-site tRNA nucleotide residues. In 2004, Strobel, Green, and coworkers confirmed earlier hints that peptidyl transfer is accelerated by a neighboring 2′-hydroxyl group on the same nucleotide. They found that deleting that 2′-hydroxyl causes a millionfold reduction in ribosome catalytic activity.
"The current model that everybody's discussing and likes is that that 2′-hydroxyl is essentially acting as a proton shuttle," Green says. The proton released from the A-site tRNA's nucleophilic amine apparently gets passed along to the 2′-hydroxyl. From there, it's passed to the protein's ester leaving group, which needs a proton to balance its negative charge. Strobel's group is currently carrying out experiments to further test the proposal.

Protein Factory Reveals Its Secrets (Part 2)

As peptide bond formation occurs, the mRNA and tRNAs translocate; that is, they shimmy over one codon length. The P-site tRNA moves to the E site, where it gets ready to leave the ribosome, and the A-site tRNA moves to the P site, opening a space on the A site for a new tRNA. When the ribosome reaches an mRNA codon that signals a stop, the protein chain is released for use by the cell. The protein makes its getaway from the ribosome through a tunnel in the large subunit.
The ribosome doesn't carry out protein translation all by itself. It gets assistance from cofactors like EF-Tu, which delivers tRNA-amino acid complexes to the ribosome; EF-G, which catalyzes translocation; and release factors, which help synthesized proteins to exit the ribosome.
Over the past few decades, researchers have worked toward a much more detailed understanding of the way the ribosome works on an atomic level. Efforts go back at least to the 1960s, when Masayasu Nomura, now professor of biological chemistry at the University of California, Irvine, and coworkers showed that the ribosome could assemble spontaneously from its component RNAs and proteins. In the 1970s and '80s, Harry Noller, director of the Center for Molecular Biology of RNA at UC Santa Cruz, and coworkers used chemical modification studies to identify key ribosomal nucleotides and obtained evidence for the idea that translocation occurs in two discrete yet coupled steps.
Structural studies have spearheaded much of the progress since then in understanding the ribosome. Crystals of a ribosome subunit for crystallographic investigation were first made in 1980 by the late biochemist H. G. Wittmann of Max Planck Institute for Molecular Genetics, Berlin; structural biologist Ada E. Yonath of Weizmann Institute, Rehovot, Israel; and coworkers. Such crystals were initially prone to radiation damage. But in 1986, Yonath's group showed that the analysis of flash-frozen crystals, a technique called cryo-crystallography, can minimize radiation damage to the ribosome. This technique has improved the quality of data from subsequent crystallography of the ribosome and other biomolecules.
In the 1990s, Howard Hughes Medical Institute Investigator Joachim Frank of both the Wadsworth Center, Albany, N.Y., and the State University of New York, Albany, and his coworkers developed single-particle cryo-electron microscopy (cryo-EM) and began using it to study ribosome structure. Single-particle cryo-EM is a technique for imaging sets of individual molecules embedded in a thin layer of ice. Cryo-EM can be used to observe a greater variety of functionally interesting forms of biomolecules than is possible with crystallography. But cryo-EM can't normally attain atomic resolution, whereas X-ray crystallography can, so Frank and his coworkers often use crystallographic data to refine their cryo-EM maps.
By applying this combined approach to ribosomes, "we have been able to visualize a plethora of different processes that weren't seen before," Frank says. For instance, he and his coworkers showed how tRNA, upon entering the ribosome with EF-Tu, acts like a molecular spring by distorting as it contacts mRNA and then straightening as it enters the A site. They also found that when EF-G binds to the ribosome to induce translocation, the small and large subunits make a ratchet motion by rotating about 10 degrees relative to each other.
In 2000, several teams of researchers captured atomic (approximately 3-Å resolution) or near-atomic crystal structures of ribosome subunits. A Yale University group led by Thomas A. Steitz and Peter B. Moore reported a 2.4-Å structure of the large subunit, which continues to be the highest resolution ribosome structure of any kind. This study helped confirm that the ribosome is a ribozyme by showing that RNA predominates in the active site. The group subsequently obtained additional structures of the large subunit bound to various substrates, among them antibiotics and transition-state analogs.

Protein Factory Reveals Its Secrets (Part 1)

Last year, the Nobel Prize in Chemistry heralded work on DNA transcription, a cornerstone process in molecular biology in which a cell synthesizes a messenger RNA (mRNA) version of genomic DNA. For some time, many research teams have been studying the other side of molecular biology's central dogma—the translation of mRNA into protein. That translation occurs on one of nature's most versatile molecular synthesizers: the ribosome.
Harry Noller
Complexity: Ribosome structure reveals the system's molecular complexity. A tRNA (orange) is shown base pairing with part of mRNA (gold) on the left and extending into the ribosome's peptidyltransferase center on the right.
If genomic DNA is the cell's planning authority, then the ribosome is its factory, churning out the proteins of life.
It's a huge complex of protein and RNA with a practical and life-affirming purpose: catalyzing protein synthesis. Bacterial cells typically contain tens of thousands of ribosomes, and eukaryotic cells can contain hundreds of thousands or even a few million of them. The ribosome found in the bacterium Escherichia coli is made up of three RNA components and more than 50 proteins. It weighs about 2.5 million daltons. Eukaryotic versions have four RNAs and about 80 proteins and weigh about 4 MDa.
These dozens of components are all squeezed into two RNA-protein subunits, one small and one large. The ribosome's active site—where proteins are created by the one-at-a-time addition of amino acids to a growing peptide chain—is located in the large subunit. The active site may make protein, but it contains very little protein of its own, with only one of the ribosome's many protein components contributing to the mostly RNA architecture of the active site. Because RNA is so predominant in the active site, the ribosome is widely believed to be an RNA catalyst, or ribozyme—and, in fact, is thought to be the largest known ribozyme.
Understanding how the ribosome works is of fundamental interest, but such knowledge also could prove useful. For example, many antibiotics target bacterial ribosomes, so ribosome research could lead to new types of antibacterial agents. Researchers at the New Haven, Conn., start-up Rib-X Pharmaceuticals are riding on that hope. They have been using structure-based design in their efforts to discover novel ribosome-targeted antibiotics.
In the past decade or so, the ribosome has gone from being a biomolecule whose very structure was largely a mystery to one whose architecture is known at an atomic level and whose detailed workings are beginning to be better understood. Scientists have determined dozens of ribosome structures. They are conducting extensive mutational studies and are assessing the catalytic role of specific ribosome residues. They also have been carrying out theoretical modeling to aid understanding of the ribosome's detailed mechanism of action.
"The current model of [ribosomal] peptide bond formation is based on many different experiments, which sometimes did not seem to agree at first glance but little by little filled in the picture," says professor of physical biochemistry Marina Rodnina of Witten/Herdecke University, Witten, Germany.
Many basic facts about ribosome-catalyzed protein synthesis have long been known. The ribosome reads mRNA's genomic message and translates it into protein. When the ribosome factory is open for business, an mRNA binds to its small subunit, and amino acids corresponding to the mRNA's sequence are delivered one by one to the ribosome by aminoacyl transfer RNAs (tRNAs). Each tRNA molecule carries an anticodon, a three-nucleotide code that corresponds to the amino acid it's carrying. These anticodons must be matched up with corresponding codons (complementary three-nucleotide codes on mRNA), to enable a protein chain to be built to order.
The ribosome complex has three tRNA binding sites: A (aminoacyl), P (peptidyl), and E (exit). When peptide bond formation occurs, the amine from a new amino acid on the tRNA bound at the A site attacks a carbonyl at the end of the growing peptide chain, which is attached to the tRNA bound at the P site. The reaction lengthens the peptide by one amino acid unit.

Improving Diagnosis Of Tropical Diseases (Part 2)

MICROFLUIDIC SYSTEMS are attractive for diagnostics in resource-limited settings, Weigl said, because they allow all three steps to be integrated in a single device that can be used by minimally trained personnel. "We can assume some training but not Ph.D. chemists," Yager said. Normally an instrument is required to drive the fluid through the microfluidic system, and sometimes for other functions such as heat cycling and detection, but manual methods also are available that allow microfluidics to be used without an instrument, Weigl said.
University of Washington
Box It Up: The DxBox, a prototype diagnostic system for developing countries, combines a lab card for samples and a reader.
Yager leads a team that is developing a microfluidics-based diagnostic system called the DxBox. The name is a sly nod to Microsoft's Xbox game system, acknowledging funding from the Bill & Melinda Gates Foundation through its Grand Challenges for Global Health initiative. The University of Washington team includes PATH and the diagnostics companies Micronics and Nanogen. Although U.S. companies are involved in the project, Yager doubts that all components of the final system will be manufactured in the U.S. "It must be produced at a cost that is appropriate for the end users," he said.
The researchers are focusing on a panel of fever-causing pathogens, such as those that cause dengue fever, measles, malaria, and typhoid, as the first application of the DxBox. To run the analysis, a blood sample is injected onto a disposable polymeric microfluidic card that is inserted into a reader. After the sample is injected, the microfluidic card takes over via computer control. The researchers will do nucleic acid assays and immunoassays on a single microfluidic card.
One type of immunoassay being pursued in the DxBox system uses a porous membrane that supports an antibody that captures antigens from the sample. The researchers then add labeled antibodies that can be detected by optical imaging of the membrane.
Weigl, who collaborates with Yager on other projects, described a different project the team is working on—a disposable microfluidic card to diagnose enteric (intestinal) pathogens. This project is slightly easier than the one to diagnose fever-causing pathogens, Weigl said, because of the higher levels of enteric pathogen in stool samples compared with the levels of fever-causing pathogens in blood.
The current state of the art in enteric pathogen identification is bacterial culture that takes one to four days and costs $200-$500 per sample. The goal is to reduce that to one day and $1.00-$5.00 per sample. With the microfluidic card, the entire process, from feces swab to polymerase chain reaction amplification of pathogen DNA to visual readout, takes less than 30 minutes, Weigl said. The researchers read the samples by integrating colored particles into the nucleic acid as it is amplified and then capturing the particles at specific places on the bottom of the microfluidic card, generating colored bands that can be read visually. The device required to read the microfluidic card is still fairly complex, he said, with a footprint the size of a laptop computer.
In another example of tropical disease detection, Antje J. Baeumner, an associate professor of bioengineering at Cornell University, described work to analyze dengue virus, which is an RNA flavivirus. Dengue virus has four different serotypes that can cause three different diseases. Unfortunately, they don't provide cross-immunity, so someone who has been infected with one serotype doesn't have immunity against the others. Knowing the serotype is important for receiving proper treatment.
Baeumner applies a method called NASBA (for nucleic acid sequence-based amplification), which uses the enzymes reverse transcriptase, RNase H, and RNA polymerase to amplify single-stranded RNA from the different types of dengue virus. The amplified nucleic acid is then combined with two sets of DNA probes in a sandwich assay. One set of probes is immobilized to a solid support and pulls the target dengue virus sequences out of the sample. A second probe, which also binds the captured nucleic acid, is tagged to dye-containing liposomes. Lysing the liposomes releases the dye and increases the signal, thereby improving the sensitivity of the assay. Baeumner has demonstrated the detection method in lateral flow assays and microfluidic systems.
Baeumner is working with New York-based Innovative Biotechnologies International to commercialize the assay technology. The assay takes only 10 to 15 minutes. The team is working on an integrated microfluidic system that can do the entire analysis from sample preparation to final detection in 30 minutes.
EVEN BETTER than tests with simple instruments would be tests that require no instruments at all. Lee's team is developing a dipstick test that can visually detect nucleic acid targets via a capture probe and reporter molecule. If a target DNA sequence is present, a colored line appears on the dipstick. Lee lamented the slow development process. "It took two years to develop the visual chemistry and another two years to make it sensitive enough," she said. Lee, who worked at Abbott Laboratories for 10 years, said that if she had been working on the project while in industry, "I would have been fired. In fact, I would have fired myself if it took that long."
These groups and others are taking steps toward affordable diagnostic methods for the developing world, but there's still a long way to go. "You never trip on boulders; you trip on pebbles," Lee said, and there are many pebbles along the way to developing simple, affordable diagnostic methods.

Improving Diagnosis Of Tropical Diseases (Part 1)

THE FIRST STEP in treating a disease is diagnosing it. For tropical diseases in developing countries, such diagnoses aren't as easy as they should be. Many of these countries lack the trained personnel or reliable infrastructure to support complicated analyses. Simplified methods and equipment could improve such diagnoses. A symposium at Pittcon, held last month in Chicago, focused on efforts to develop better, cheaper methods for diagnosing tropical diseases. The American Chemical Society's Division of Analytical Chemistry organized and sponsored the symposium.
Donor testing: Diagnostic testing of blood donors in Ghana takes place outside the laboratory. (Courtesy of Helen Lee)
Helen Lee of the department of hematology at the University of Cambridge presented surprising statistics about the availability of basic supplies and infrastructure in developing countries. In a survey of African facilities, a small yet significant fraction lacked even the most basic resources, such as a reliable source of electricity and tap water (17% and 7%, respectively), she said. The availability of other supplies was worse; 40% of facilities lacked incinerators to deal with medical waste.
Beyond the availability of basic resources, cost is a big issue in developing countries. For example, Lee said, the blood banks in Kumasi, Ghana, have a total annual budget of approximately $70,000, with 15% allocated to testing and 17% to consumables. In light of such meager funds, Bernhard Weigl, the group leader for the diagnostics development team at the Seattle-based nonprofit organization Program for Appropriate Technology in Health (PATH), said that his organization's goal is to lower the cost of each infectious disease test to no more than $1.50.
Infectious disease testing is important not only for diagnosing and treating patients but also for ensuring the safety of the emergency blood supply. African hospitals and clinics can't rely on a large volunteer donor pool like that available in many developed countries, Lee said. Approximately half of the donors are older family members with a higher prevalence of transfusion-transmitted viruses. In a study of more than 1,000 blood donors at a hospital in Ghana, nearly 20% were infected with one or more of HIV, hepatitis B, and hepatitis C, Lee said. Such findings demonstrate the magnitude of the challenge, because among developing nations "Ghana is a well-managed country with good infrastructure," she said.
The testing is often carried out under less than ideal circumstances. Lee described open-air pre-donation testing of volunteer blood donors recruited by FM radio in Kumasi. Access to refrigeration is often limited, so assays and reagents in tropical climates must be able to withstand temperatures that can soar past 35 °C (95 °F).
Given such challenges, scientists must make sure they are developing diagnostic tests that health care providers in developing countries want and need and, most important, can actually use. "The last thing" Western scientists "want to do is build something and find that no one wants to use it," said Paul Yager, a professor of bioengineering at the University of Washington, Seattle. His team has done an initial assessment of the needs in India, their first target country, and is working on similar assessments in Brazil and sub-Saharan Africa.
For example, Yager said, tests need to be fast enough for patients to receive treatment. If the analysis takes too much time, patients might leave without appropriate treatment. Each sample test should ideally be completed in 15 minutes, he noted.
Diagnosis of tropical infectious diseases generally involves two types of assays. Nucleic acid analyses measure the presence of the pathogen's genetic material—either DNA or RNA—in the patient; immunoassays measure the patient's antibody response to the pathogen. Which assay provides better diagnosis depends on the stage of the infection. At early stages, nucleic acid analysis is better, but at later stages, the immunoassay is preferable. Because health care workers don't know at what stage of infection they are testing, they would ideally run both assays.
Nucleic acid testing involves three unavoidable steps, Lee said: sample preparation, amplification, and detection. Of those three, sample prep is always the one that will "stump" practitioners, she said. "There's no sense in using a simple back end if you don't have a simple front end," she said.

24 March 2007

Transit of Venus

The 2004 transit of Venus
A transit of Venus across the Sun takes place when the planet Venus passes directly between the Sun and Earth, obscuring a small portion of the Sun's disk. During a transit, Venus can be seen from Earth as a small black disk moving across the face of the Sun. The duration of such transits is usually measured in hours (the transit of 2004 lasted six hours). A transit is similar to a solar eclipse by the Moon, but, although the diameter of Venus is almost four times that of the Moon, Venus appears much smaller because it is much farther from Earth. Before the space age, observations of transits of Venus helped scientists use the parallax method to calculate the distance between the Sun and Earth.
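As a rough illustration of the parallax method mentioned above (the figures below are modern reference values, not numbers from this article), the calculation boils down to taking the tiny angle that Earth's radius subtends as seen from the Sun and dividing it into that radius:

```python
# Illustrative sketch: converting a measured solar parallax into the
# Earth-Sun distance. The parallax of 8.794 arcsec and Earth's radius
# are modern reference figures; 19th-century transit observations
# yielded cruder values of the same angle.

import math

EARTH_RADIUS_KM = 6378.1        # Earth's equatorial radius
SOLAR_PARALLAX_ARCSEC = 8.794   # angle Earth's radius subtends at 1 AU

def earth_sun_distance_km(parallax_arcsec):
    """Distance at which Earth's radius subtends the given angle."""
    parallax_rad = math.radians(parallax_arcsec / 3600.0)
    return EARTH_RADIUS_KM / math.sin(parallax_rad)

print(round(earth_sun_distance_km(SOLAR_PARALLAX_ARCSEC) / 1e6, 1))
# -> 149.6 (million km, i.e. one astronomical unit)
```

Historical transit expeditions derived the parallax angle itself by comparing transit timings from widely separated observing sites; the conversion step is the one sketched here.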
Transits of Venus are among the rarest of predictable astronomical phenomena and currently occur in a pattern that repeats every 243 years, with pairs of transits eight years apart separated by long gaps of 121.5 years and 105.5 years. Before 2004, the last pair of transits occurred in December 1874 and December 1882. The first of a pair of transits of Venus at the beginning of the 21st century took place on June 8, 2004 (see Transit of Venus, 2004), and the next will be on June 6, 2012 (see Transit of Venus, 2012). After 2012, the next transits of Venus will be in December 2117 and December 2125.[1]
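The repeating pattern described above can be sketched in a few lines of code. Using decimal years is an illustrative device here, chosen so that the half-year gaps shift December transits to June and back; note that the four intervals sum to the full 243-year cycle.

```python
# Sketch: regenerate the transit-of-Venus years from the repeating
# pattern of pairs eight years apart separated by gaps of 121.5 and
# 105.5 years, starting from the December 1874 transit (~1874.9 in
# decimal years).

from itertools import cycle

def transit_years(start=1874.9, n=6):
    """Yield approximate transit years, beginning with December 1874."""
    gaps = cycle([8, 121.5, 8, 105.5])  # intervals sum to 243 years
    year = start
    for _ in range(n):
        yield int(year)
        year += next(gaps)

print(list(transit_years()))  # [1874, 1882, 2004, 2012, 2117, 2125]
```

The output reproduces the six transit years named in the text, and the half-year gaps correctly alternate the transits between December and June.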
A transit of Venus can be safely observed by taking the same precautions as when observing the partial phases of a solar eclipse. Staring at the brilliant disk of the Sun (the photosphere) with the unprotected eye can quickly cause serious and often permanent eye damage.[2]

Planetary Nebula

NGC 6543, the Cat's Eye Nebula
A planetary nebula is an astronomical object consisting of a glowing shell of gas and plasma formed by certain types of stars at the end of their lives. They are in fact unrelated to planets; the name originates from a supposed similarity in appearance to giant planets. They are a relatively short-lived phenomenon, lasting a few tens of thousands of years, compared to a typical stellar lifetime of several billion years. About 1,500 are known to exist in the Milky Way Galaxy.
Planetary nebulae are important objects in astronomy because they play a crucial role in the chemical evolution of the galaxy, returning material enriched in heavy elements and other products of nucleosynthesis (such as carbon, nitrogen, oxygen and calcium) to the interstellar medium. In other galaxies, planetary nebulae may be the only readily observable objects that yield useful information about chemical abundances.
In recent years, Hubble Space Telescope images have revealed many planetary nebulae to have extremely complex and varied morphologies. About a fifth are roughly spherical, but the majority are not spherically symmetric. The mechanisms which produce such a wide variety of shapes and features are not yet well understood, but binary central stars, stellar winds and magnetic fields may all play a role.

ATLAS Experiment

The accelerator chain of the Large Hadron Collider (LHC)

LHC experiments:
ATLAS: A Toroidal LHC ApparatuS
CMS: Compact Muon Solenoid
LHCb: LHC-beauty
ALICE: A Large Ion Collider Experiment
TOTEM: Total Cross Section, Elastic Scattering and Diffraction Dissociation
LHCf: LHC-forward

LHC preaccelerators:
p and Pb: linear accelerators for protons and lead ions
(not marked): Proton Synchrotron Booster
PS: Proton Synchrotron
SPS: Super Proton Synchrotron
ATLAS (A Toroidal LHC ApparatuS) is one of the five particle detector experiments (ALICE, ATLAS, CMS, TOTEM, and LHCb) currently being constructed at the Large Hadron Collider, a new particle accelerator at CERN in Switzerland. When completed, ATLAS will be 46 metres long and 25 metres in diameter and will weigh about 7,000 tonnes. The project involves roughly 2,000 scientists and engineers at 165 institutions in 35 countries.[1] Construction is scheduled to be completed in June 2007. The experiment is designed to observe phenomena that involve highly massive particles, which were not observable using earlier, lower-energy accelerators, and might shed light on new theories of particle physics beyond the Standard Model.
The ATLAS collaboration, the group of physicists building the detector, was formed in 1992 when the proposed EAGLE (Experiment for Accurate Gamma, Lepton and Energy Measurements) and ASCOT (Apparatus with Super COnducting Toroids) collaborations merged their efforts to build a single, general-purpose particle detector for the Large Hadron Collider.[2] The design was a combination of those two previous designs, as well as the detector research and development that had been done for the Superconducting Super Collider. The ATLAS experiment was proposed in its current form in 1994 and officially funded by the CERN member countries beginning in 1995. Additional countries, universities, and laboratories joined in subsequent years, and further institutions and physicists continue to join the collaboration even today. Construction began at individual institutions, with detector components shipped to CERN and assembled in the ATLAS experimental pit beginning in 2003.
ATLAS is designed as a general-purpose detector. When the proton beams produced by the Large Hadron Collider interact in the center of the detector, a variety of different particles with a broad range of energies may be produced. Rather than focusing on a particular physical process, ATLAS is designed to measure the broadest possible range of signals. This is intended to ensure that, whatever form any new physical processes or particles might take, ATLAS will be able to detect them and measure their properties. Experiments at earlier colliders, such as the Tevatron and the Large Electron-Positron Collider, were designed based on a similar philosophy. However, the unique challenges of the Large Hadron Collider (its unprecedented energy and extremely high rate of collisions) require ATLAS to be larger and more complex than any detector ever built.

Scientific Method

The scientific method seeks to explain the complexities of nature in a replicable way, and to use these explanations to make useful predictions. It provides an objective process to find solutions to problems in a number of scientific and technological fields. Often scientists have a preference for one outcome over another, and they recognize that it is important that this preference not bias their interpretation. A strict following of the scientific method attempts to minimize the influence of a scientist's bias on the outcome of an experiment. This can be achieved by correct experimental design, and a thorough peer review of the experimental results as well as conclusions of a study.

Scientists use the word model to refer to a description or depiction of something, specifically one which can be used to make predictions that can be tested by experiment or observation. A hypothesis is a contention that has been neither well supported nor yet ruled out by experiment. A theory, in the context of science, is a logically self-consistent model or framework for describing the behavior of certain natural phenomena. A theory typically describes the behavior of much broader sets of phenomena than a hypothesis; commonly, a large number of hypotheses may be logically bound together by a single theory. A physical law or law of nature is a scientific generalization based on a sufficiently large number of empirical observations that it is taken as fully verified.
Scientists never claim absolute knowledge of nature or the behavior of the subject of the field of study. Certain scientific "facts" are linguistic (such as the fact that humans are mammals), but these are true only by definition, and they reflect only truths relative to agreed convention. These deductive facts may be absolute, but they say something only about human language and expression, not about the external world. This part of science is like mathematics.
Another part of science is inductive and attempts to say something about the external world that is not true by definition but can be shown to be true in specific instances by experiment or observation. Unlike a mathematical proof, a scientific theory that makes statements about nature inductively is always open to falsification if new evidence is presented. Even the most basic and fundamental theories may turn out to be imperfect if new observations are inconsistent with them. Critical to this process is making every relevant aspect of research publicly available, which permits peer review of published results and also allows ongoing review and repetition of experiments and observations by multiple researchers operating independently of one another. Only by fulfilling these expectations can it be determined how reliable the experimental results are for potential use by others.
Isaac Newton's law of gravitation is a famous example of an established law that was later found not to be universal: it does not hold in experiments involving motion at speeds close to the speed of light or in the close proximity of strong gravitational fields. Outside these conditions, Newton's Laws remain an excellent model of motion and gravity. Since general relativity accounts for all the same phenomena that Newton's Laws do and more, general relativity is now regarded as a more comprehensive theory.

Length of Saturn's Day Remains Unknown, But Now We Know Why We Don't Know

Strangely, astronomers don't know how long a day is on Saturn, because they can't get a firm footing on the problem given the giant planet's gaseous nature.
So they have long relied on radio measurements of the ringed planet's magnetic field to help estimate the length of the day. But that doesn't really work either, they realized, so estimates have remained loose. Now the scientists at least have a better handle on this aspect of the problem.
Geyser activity from Saturn's small moon Enceladus weighs down the big planet's magnetic field so much that the field rotates more slowly than Saturn itself, new observations reveal. The moon is a mere 310 miles (500 kilometers) wide.
"No one could have predicted that the little moon Enceladus would have such an influence on the radio technique that has been used for years to determine the length of the Saturn day," said Don Gurnett of the University of Iowa.
Gurnett is the principal investigator on a radio and plasma wave science experiment on NASA's Cassini spacecraft. The idea has been to measure Saturn's rotation by taking its radio pulse. The technique works pretty well on the other giant planets.
But the new observations, reported online this week by the journal Science, show that the invisible magnetic field lines, which emanate from Saturn's poles and radiate out like a giant, skeletal pumpkin, slip in relation to the planet's rotation.
The slip owes to the collective weight of electrically charged particles that originate in Enceladus' remarkable geysers of water vapor and ice. Particles from the geysers encircle Saturn and become electrically charged, forming a disk of hot, charged gas, called a plasma, around the planet's equator.
Meanwhile, measurements revealed last year that Saturn's day has gotten about six to eight minutes longer since the 1980s, when it was measured by the Voyager missions; it is now roughly 10 hours and 47 minutes. Nobody expects the trend to continue forever (meaning the days would just keep getting longer at such a rapid rate), but they also don't know what's going on.
Either the geysers on Enceladus are more active now than in the '80s, the astronomers figure, or perhaps there are seasonal variations as Saturn orbits the Sun; a Saturn year takes more than 29 Earth years to complete.
"One would predict that when the geysers are very active, the particles load down the magnetic field and increase the slippage of the plasma disk, thereby increasing the radio emission period even more," Gurnett said Thursday. "If the geysers are less active, there would be less of a load on the magnetic field, and therefore less slippage of the plasma disk, and a shorter period."
"The direct link between radio, magnetic field and deep planetary rotation has been taken for granted up to now," said Michele Dougherty, a researcher at Imperial College London and principal investigator on Cassini's magnetometer instrument. "Saturn is showing we need to think further." (from yahoo news)

A Nose for Nectar

As anyone who's sipped a smoothie knows, thick drinks are hard to suck through a straw. The orchid bee faces the same problem in trying to draw nectar up its long proboscis: According to a study published in the May issue of The American Naturalist, the insect consumes nectar five times more slowly than do lapping bees, such as honeybees. There's a reward for such plodding, though. The long-tubed tropical flowers the orchid bee visits offer up to 10 times more nectar than the flowers lapping bees feast on. And perhaps because the nectar sits too deep for other bee species to reach, orchid bees can afford to take their sweet time. (Photo: G. Dimijian)

Science And Social Concerns

A good understanding of science is important because it helps people to better utilize technology, which most humans interact with on a daily basis. This is especially significant in developed countries, where advanced technology has become an important part of people's lives. Science education aims at increasing common knowledge about science and widening social awareness of scientific findings and issues. In developed countries, the process of learning science begins early in life for many people; school students start learning about science as soon as they acquire basic language skills, and science is often an essential part of the curriculum. Science education is also a very vibrant field of study and research. Learning science requires learning its language, which often differs from colloquial language. For example, the physical sciences rely heavily on mathematical notation, and Latin classification is pervasive in biological studies. The language used to communicate science is rife with terms pertaining to concepts, phenomena, and processes that are initially alien to children.
Due to the growing economic value of technology and industrial research, the economy of any modern country depends on its state of science and technology. The governments of most developed and developing countries therefore dedicate a significant portion of their annual budget to scientific and technological research. Many countries have an official science policy, and many undertake large-scale scientific projects, so-called "big science". The practice of science by scientists has undergone remarkable changes in the past few centuries. Most scientific research is currently funded by government or corporate bodies. These relatively recent economic factors appear to increase the incentive for some to engage in fraud in reporting the results of scientific research,[1][2] often termed scientific misconduct. Occasional instances of verified scientific misconduct, however, are by no means solely modern occurrences (see also: junk science). In the United States, some have argued that with the politicization of science, funding for scientific research has suffered.

Philosophy of Science

The philosophy of science seeks to understand the nature and justification of scientific knowledge and its ethical implications. It has proven difficult to provide a definitive account of the scientific method that can decisively serve to distinguish science from non-science. Thus there are legitimate arguments about exactly where the borders are. There is nonetheless a set of core precepts that have broad consensus among published philosophers of science and within the scientific community at large. (see: Problem of demarcation)
Science is reason-based analysis of sensation upon our awareness. As such, the scientific method cannot deduce anything about the realm of reality that is beyond what is observable by existing or theoretical means. When a manifestation of our reality previously considered supernatural is understood in terms of causes and consequences, it acquires a scientific explanation.
Resting on reason and logic, along with other guidelines such as Occam's razor, which states a principle of parsimony, scientists formulate theories and select the most promising one after analysing the collected evidence. Some of the findings of science can be very counter-intuitive. Atomic theory, for example, implies that a granite boulder which appears to be a heavy, hard, solid, grey object is actually a combination of subatomic particles with none of these properties, moving very rapidly in space, with the mass concentrated in a very small fraction of the total volume. Many of humanity's preconceived notions about the workings of the universe have been challenged by new scientific discoveries. Quantum mechanics, in particular, examines phenomena that seem to defy our most basic postulates about causality and our fundamental understanding of the world around us. Science is the branch of knowledge dealing with the understanding we have of our environment and how it works.
There are different schools of thought in the philosophy of scientific method. Methodological naturalism maintains that scientific investigation must adhere to empirical study and independent verification as a process for properly developing and evaluating natural explanations for observable phenomena. Methodological naturalism therefore rejects supernatural explanations, arguments from authority, and biased observational studies. Critical rationalism instead holds that unbiased observation is not possible and that a demarcation between natural and supernatural explanations is arbitrary; it proposes falsifiability as the landmark of empirical theories and falsification as the universal empirical method. Critical rationalism argues for the primacy of science, but at the same time against its authority, by emphasizing its inherent fallibility. It proposes that science should be content with the rational elimination of errors in its theories, not with seeking their verification (such as claiming certain or probable proof or disproof; both the proposal and falsification of a theory are only of methodological, conjectural, and tentative character in critical rationalism). Instrumentalism rejects the concept of truth and emphasizes merely the utility of theories as instruments for explaining and predicting phenomena.