Intel announces Thunderbolt 3 with USB-C connector, double the bandwidth

Adopting USB-C means that Thunderbolt 3 will feature a reversible connector, a welcome feature

Intel has unveiled its third-generation Thunderbolt interface, shedding the loyal Mini DisplayPort connector in favor of the nascent USB-C format. In addition to offering greater versatility when hooking up peripherals, Thunderbolt 3 doubles bandwidth from the second generation’s 20 Gbps to 40 Gbps and can pipe power to your devices at the same time.

Apple raised its share of eyebrows when it announced a new MacBook bearing only a single port earlier this year. The move was met with a healthy amount of intrigue, and even scepticism, but hinted at a future where less might mean much more when it comes to plugging things into your machine. Intel’s announcement at the Computex conference in Taipei on Tuesday gives this vision a nudge along, in essence promising a single standard capable of serving everybody from the casual user to professionals who deal in huge data transfers.

Adopting USB-C means that Thunderbolt 3 will feature a reversible connector. It can drive two 4K displays at 60 Hz and charge your device with up to 100 W of power. It incorporates support for DisplayPort 1.2 and USB 3.1, meaning that existing USB-C cables can still be accommodated, though data transfer over those will top out at 10 Gbps.
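To put those bandwidth figures in perspective, here is a rough back-of-the-envelope sketch (not from the article) of how long a large file transfer would take at each link speed. The 100 GB file size is an arbitrary example, and real-world throughput would be lower due to protocol overhead.

```python
# Rough transfer-time comparison for the link speeds mentioned above.
# Illustrative only: real throughput is reduced by protocol overhead.

def transfer_time_seconds(size_gigabytes: float, link_gbps: float) -> float:
    """Idealized time to move size_gigabytes of data over a link_gbps link."""
    size_gigabits = size_gigabytes * 8  # 1 byte = 8 bits
    return size_gigabits / link_gbps

FILE_GB = 100  # hypothetical large video project

for name, gbps in [("Thunderbolt 2", 20),
                   ("Thunderbolt 3", 40),
                   ("USB 3.1 over USB-C", 10)]:
    print(f"{name} ({gbps} Gbps): {transfer_time_seconds(FILE_GB, gbps):.0f} s")
```

At these idealized rates, doubling the link speed from 20 to 40 Gbps halves the transfer time for the same file.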

These combine into a pretty impressive list of specs, but Thunderbolt 3’s most profound impact may prove to be ushering in widely useful single-port machines. With the relatively small USB-C connector finding newly compatible devices all the time, the standard could extend to ever-slimming phones and tablets, sending tangled webs of cables the way of the compact disc.

Intel plans to begin shipping Thunderbolt 3 products before the end of this year.

References: http://www.gizmag.com/

Scientists come a step closer to “regrowing” limbs

The decellularized rat forelimb is injected with muscle and vascular cells

Currently, recipients of arm or leg transplants need to take immunosuppressive drugs for the rest of their lives, in order to keep the donated parts from being rejected. If we could grow our own replacement limbs, however, that wouldn’t be necessary. And while we do already possess the progenitor cells needed to grow such parts, what’s been lacking is a method of assembling them into the form of the desired limb. Now, however, scientists have created a shortcut of sorts – they’ve stripped the cells from one rat’s forelimb and replaced them with live cells from another rat, creating a functioning limb that the second rat’s immune system won’t reject.

Led by Dr. Harald Ott, a team at the Massachusetts General Hospital started by perfusing the donor limb with a detergent that stripped away all of its living cells. After the cellular debris was removed, all that remained was the empty non-living extracellular matrix that formerly contained the cells.

While this was underway, progenitor cells from the recipient rat were being grown in culture to produce muscle and vascular cells.

Once the limb was stripped of its original cells, it was placed in a nutrient solution-filled bioreactor and injected with the lab-grown cells – the muscle cells went into the individual muscle sheath sections of the matrix, while the vascular cells went into the main artery. After five days in the reactor, electrical stimulation was applied to help the muscles grow. Two weeks later, upon being removed from the reactor, the limb was found to have functioning muscle cells in the muscle fibers and live vascular cells in the blood vessel walls.

When the muscles were activated using electrical stimulation, they were found to have 80 percent of the strength of a newborn rat’s forelimb muscles. Additionally, when the limb was transplanted onto the recipient rat, its blood vessels soon filled with blood and became part of the circulatory system.

Ott and his team are now looking at ways of regrowing other limb tissues such as bone, cartilage and connective tissue. The regrowth of nerves should hopefully happen on its own. “In clinical limb transplantation, nerves do grow back into the graft, enabling both motion and sensation, and we have learned that this process is largely guided by the nerve matrix within the graft,” he says. “We hope in future work to show that the same will apply to bioartificial grafts.”

The decellularization technique utilized by the Massachusetts scientists, incidentally, has previously been used to create transplantable mouse hearts and rat kidneys. However, this is the first time that it’s been used for something more complex than an organ. The scientists have also decellularized a baboon forearm, indicating that the procedure could work on primates.

References: http://www.gizmag.com/

Almost universal SERS sensor could change how we sniff out small things

A new almost-universal SERS substrate could be the key to cheaper and easier sensors for drugs, explosives, or anything else (Credit: University at Buffalo)

Identifying fraudulent paintings from spectroscopic data, highlighting cancerous cells in a sea of healthy ones, and distinguishing strains of bacteria in food samples are all applications of surface-enhanced Raman spectroscopy (SERS), a sensing technique that has only grown more in-demand as our appetite for precise, instantaneous information has increased. However, the technology has largely failed to reach commercialization because the chips used are difficult and expensive to create, work only for a particular known substance, and are consumed upon use. Researchers led by a team from the University at Buffalo (UB) aim to change that with an almost-universal substrate that is also low-cost, opening up more opportunities for powerful analysis of our environment.

Though SERS can be complicated to understand, it forms a critical part of testing for explosives, identifying toxins in food, and other applications in public health and safety, medicine, and research.

The technique relies on the unique electromagnetic response of chemical compounds when they are stimulated with laser light of varying wavelengths while interacting with a surface designed to enhance the signal (the “surface” in SERS). Each compound has a distinct spectral fingerprint, so a researcher can discern between compounds invisible to the human eye without having to dope the sample with labeling chemicals or possess a large sample.
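The fingerprint idea above can be illustrated with a toy sketch (not from the paper): match a measured spectrum against a small library of reference fingerprints by cosine similarity. The compound names and intensity vectors here are made up, standing in for real spectra sampled on a shared wavenumber grid.

```python
from math import sqrt

# Hypothetical reference "fingerprints": intensity vectors on a shared
# wavenumber grid. Real SERS libraries hold far finer-grained spectra.
reference_library = {
    "compound_A": [0.10, 0.90, 0.20, 0.05, 0.40],
    "compound_B": [0.80, 0.10, 0.10, 0.70, 0.05],
}

def cosine(a, b):
    """Cosine similarity between two equal-length intensity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def identify(measured):
    """Return the library compound whose fingerprint best matches `measured`."""
    return max(reference_library,
               key=lambda name: cosine(measured, reference_library[name]))

# A noisy sample that resembles compound_A's fingerprint:
print(identify([0.12, 0.85, 0.22, 0.07, 0.38]))
```

The point of the matching step is that each compound’s spectral shape, not its absolute intensity, identifies it, which is why cosine similarity (scale-invariant) is a natural toy choice here.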

Yet most surfaces, or substrates, available on commercial chips today are optimized for only one wavelength of light, meaning scientists working with multiple compounds may need several chips to identify all their samples. It also rules out identifying unknown samples, which by their nature would require testing on multiple substrates.

The research from the team from UB and Fudan University in China introduces a substrate with a broadband nanostructure that “traps” the wide range of light most often used in SERS analysis, between 450 and 1100 nm.

The surface is composed of a thin film of silver or aluminum that acts as a mirror, and a dielectric layer of silica or alumina that separates the “mirror” from a layer of randomly applied silver nanoparticles. This construction also avoids expensive lithographic fabrication techniques.

Researcher Nan Zhang summed up the importance of the design by comparing it to a skeleton key.

“Instead of needing all these different substrates to measure Raman signals excited by different wavelengths, you’ll eventually need just one,” says Zhang. “Just like a skeleton key that opens many doors.”

Perhaps soon these “keys” will be available for airport screening, counterfeit protection, chemical weapon detection and a host of other purposes requiring flexible, cheap sensors.

“The applications of such a device are far-reaching,” said Kai Liu, a PhD candidate in electrical engineering at UB. “The ability to detect even smaller amounts of chemical and biological molecules could be helpful with biosensors that are used to detect cancer, malaria, HIV and other illnesses.”

References: http://www.gizmag.com/

A new tool measures the distance between phonon collisions

Today’s computer chips pack billions of tiny transistors onto a plate of silicon the width of a fingernail. Each transistor, just tens of nanometers wide, acts as a switch that, in concert with others, carries out a computer’s computations. As dense forests of transistors signal back and forth, they give off heat, which can fry the electronics if a chip gets too hot.
Manufacturers commonly apply a classical diffusion theory to gauge a transistor’s temperature rise in a computer chip. But now an experiment by MIT engineers suggests that this common theory doesn’t hold up at extremely small length scales. The group’s results indicate that the diffusion theory underestimates the temperature rise of nanoscale heat sources, such as a computer chip’s transistors. Such a miscalculation could affect the reliability and performance of chips and other microelectronic devices.
“We verified that when the heat source is very small, you cannot use the diffusion theory to calculate temperature rise of a device. Temperature rise is higher than diffusion prediction, and in microelectronics, you don’t want that to happen,” says Professor Gang Chen, head of the Department of Mechanical Engineering at MIT. “So this might change the way people think about how to model thermal problems in microelectronics.”
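The classical baseline being tested here can be sketched with a textbook spreading-resistance formula (my illustration, not the paper’s analysis): for an isothermal circular heat source of radius a on a semi-infinite substrate, diffusion theory predicts a steady-state temperature rise of roughly dT = Q / (4·k·a). The power level and radii below are arbitrary example values.

```python
# Classical diffusion (spreading-resistance) estimate of temperature rise
# for a circular heat source of radius `radius_m` dissipating `power_w`
# on a semi-infinite substrate of conductivity `k`: dT = Q / (4*k*a).
# This is the baseline the MIT experiment shows breaks down at the
# nanoscale, where the true temperature rise is higher.

def diffusive_temp_rise(power_w: float, k: float, radius_m: float) -> float:
    """Diffusion-theory temperature rise, in kelvin."""
    return power_w / (4 * k * radius_m)

# Same 1 mW through silicon (k ~ 150 W/m·K), with a shrinking source:
for radius in (1e-6, 100e-9, 30e-9):
    dT = diffusive_temp_rise(1e-3, 150.0, radius)
    print(f"a = {radius * 1e9:4.0f} nm -> dT ~ {dT:6.2f} K (diffusion theory)")
```

Even within classical theory the predicted rise grows as the source shrinks; the experiment’s point is that below roughly the phonon mean free path, the real rise exceeds even this prediction.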
The group, including graduate student Lingping Zeng and Institute Professor Mildred Dresselhaus of MIT, Yongjie Hu of the University of California at Los Angeles, and Austin Minnich of Caltech, has published its results this week in the journal Nature Nanotechnology.
Phonon mean free path distribution

Chen and his colleagues came to their conclusion after devising an experiment to measure heat carriers’ “mean free path” distribution in a material. In semiconductors and dielectrics, heat typically flows in the form of phonons—wavelike particles that carry heat through a material and experience various scatterings during their propagation. A phonon’s mean free path is the distance a phonon can carry heat before colliding with another particle; the longer a phonon’s mean free path, the better it is able to carry, or conduct, heat.
As the mean free path can vary from phonon to phonon in a given material—from several nanometers to microns—the material exhibits a mean free path distribution, or range. Chen, the Carl Richard Soderberg Professor in Power Engineering at MIT, reasoned that measuring this distribution would provide a more detailed picture of a material’s heat-carrying capability, enabling researchers to engineer materials, for example, using nanostructures to limit the distance that phonons travel.
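A single “average” mean free path can be estimated from kinetic theory, k = (1/3)·C·v·L, using textbook room-temperature values for silicon (my back-of-the-envelope sketch, not the paper’s method; the group’s whole point is that such a single average hides the broad distribution described above).

```python
# Gray-body kinetic-theory estimate of the average phonon mean free path
# in silicon: k = (1/3) * C * v * L  =>  L = 3k / (C * v).
# Approximate room-temperature textbook values:

k = 150.0    # bulk thermal conductivity of silicon, W/(m*K)
C = 1.66e6   # volumetric heat capacity, J/(m^3*K)
v = 6400.0   # average phonon (sound) velocity, m/s

mfp = 3 * k / (C * v)  # meters
print(f"average phonon mean free path ~ {mfp * 1e9:.0f} nm")
```

The result lands in the tens of nanometers, consistent with the article’s note that a one-micron laser spot is more than ten times longer than some phonon mean free paths, while individual phonons in the distribution can range from nanometers to microns.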
The group sought to establish a framework and tool to measure the mean free path distribution in a number of technologically interesting materials. There are two thermal transport regimes: the diffusive regime and the quasiballistic regime. The former returns the bulk thermal conductivity, which masks the important mean free path distribution. To study phonons’ mean free paths, the researchers realized they would need a heat source that is small compared with the phonon mean free path in order to access the quasiballistic regime, as larger heat sources would essentially mask individual phonons’ effects.
Creating nanoscale heat sources was a significant challenge: lasers can only be focused to a spot about the size of the light’s wavelength, roughly one micron, more than 10 times the mean free path of some phonons. To concentrate the energy of laser light onto an even finer area, the team patterned aluminum dots of various sizes, from tens of micrometers down to 30 nanometers, across the surfaces of silicon, silicon germanium alloy, gallium arsenide, gallium nitride, and sapphire. Each dot absorbs and concentrates the laser’s heat, which then flows through the underlying material as phonons.
In their experiments, Chen and his colleagues used microfabrication to vary the size of the aluminum dots, and measured the decay of a pulsed laser reflected from the material—an indirect measure of the heat propagation in the material. They found that as the size of the heat source becomes smaller, the temperature rise deviates from the diffusion theory.
They interpret this to mean that as the metal dots (the heat sources) become smaller, phonons leaving the dots tend to become “ballistic,” shooting across the underlying material without scattering. Such phonons do not contribute much to the material’s measured thermal conductivity. For much larger heat sources acting on the same material, phonons tend to collide with other phonons and scatter more often; in those cases, the diffusion theory currently in use remains valid.
A detailed transport picture
For each material, the researchers plotted a distribution of mean free paths, reconstructed from the heater-size-dependent thermal conductivity of a material. Overall, they observed the anticipated new picture of heat conduction: While the common, classical diffusion theory is applicable to large heat sources, it fails for small heat sources. By varying the size of heat sources, Chen and his colleagues can map out how far phonons travel between collisions, and how much they contribute to heat conduction.
Zeng says that the group’s experimental setup can be used to better understand, and potentially tune, a material’s thermal conductivity. For example, if an engineer desires a material with certain thermal properties, the mean free path distribution could serve as a blueprint to design specific “scattering centers” within the material—locations that prompt phonon collisions, in turn scattering heat propagation, leading to reduced heat carrying ability. Although such effects are not desirable in keeping a computer chip cool, they are suitable in thermoelectric devices, which convert heat to electricity. For such applications, materials that are electrically conducting but thermally insulating are desired.
“The important thing is, we have a spectroscopy tool to measure the mean free path distribution, and that distribution is important for many technological applications,” Zeng says.

References: http://phys.org/