A California-based start-up called NanoLambda has developed a low-cost ($10) spectrometer sensor chip called Apollo, opening up a wide range of new sensing applications. The sensor measures individual wavelengths of light and is accurate to 1 nm with 10 nm resolution.
Optical spectroscopy is a powerful non-invasive diagnostic technique that has been used for decades in many fields, including health care. However, the cost of traditional spectrum-analysis equipment has limited its use to professional applications. Using the company’s nano fusion technology, NanoLambda has fabricated a sensor from thin nanofilter arrays, reducing size and cost to about 1% of traditional sensing equipment. The sensor’s sensitivity bandwidth can be defined in the manufacturing process and even allows detection in the infrared region. Its small physical size makes it ideal for unobtrusive, wearable health-monitoring applications.
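A spectrometer chip like this reports intensity per wavelength band; a typical first processing step is finding the dominant wavelength in a scan. The sketch below assumes a hypothetical list of (wavelength, intensity) readings spaced 10 nm apart to match the sensor's stated resolution; the values and the `peak_wavelength` helper are illustrative, not NanoLambda's API.

```python
def peak_wavelength(spectrum):
    """Return the wavelength (nm) with the highest measured intensity.

    spectrum: list of (wavelength_nm, intensity) pairs, e.g. one reading
    per 10 nm band to match the sensor's stated resolution.
    """
    wavelength, _ = max(spectrum, key=lambda pair: pair[1])
    return wavelength

# Illustrative readings across part of the visible range, 10 nm apart.
scan = [(500, 0.12), (510, 0.30), (520, 0.85), (530, 0.64), (540, 0.21)]
print(peak_wavelength(scan))  # -> 520
```

A real application (e.g. a wearable pulse-oximetry monitor) would compare intensities at specific absorption bands rather than just the global peak, but the data shape is the same.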
Spectrometer-on-a-Chip - [Link]
by Stanford University @ phys.org. For decades, the mantra of electronics has been smaller, faster, cheaper. Today, Stanford engineers add a fourth word – taller.
At a conference in San Francisco, a Stanford team will reveal how to build high-rise chips that could leapfrog the performance of the single-story logic and memory chips on today’s circuit cards.
Those circuit cards are like busy cities in which logic chips compute and memory chips store data. But when the computer gets busy, the wires connecting logic and memory can get jammed.
The Stanford approach would end these jams by building layers of logic atop layers of memory to create a tightly interconnected high-rise chip. Many thousands of nanoscale electronic “elevators” would move data between the layers much faster, using less electricity, than the bottleneck-prone wires connecting today’s single-story logic and memory chips.
Researchers combine logic, memory to build a ‘high-rise’ chip - [Link]
Amy Norcross @ edn.com:
HRL Laboratories, based in Malibu, CA, recently tested a prototype neuromorphic chip with 576 silicon neurons aboard a tiny drone measuring 6×6×1.5 inches and weighing 93 grams. The project was funded by the Defense Advanced Research Projects Agency (DARPA).
The drone, custom built for the test by AeroVironment of Monrovia, CA, flew between three separate rooms. The aircraft was able to process data from its optical, ultrasound, and infrared sensors and recognize when it was in a new or familiar room.
Smart chip mimics human brain functions - [Link]
by Suzanne Deffree @ edn.com:
Intel announced its 4004 processor and its chipset through an ad in Electronic News on November 15, 1971, making the 4004 the first complete CPU on one chip and the first commercially available microprocessor.
The building-block 4004 CPU held 2300 transistors. The microprocessor, the size of a little fingernail, delivered the same computing power as the first electronic computer built in 1946, which, in contrast, filled a room. Full technical details for the 4004 can be found in this January 1972 EDN story on the technology: One-Chip CPU available for low-cost dedicated computers.
Intel 4004 is announced, November 15, 1971 - [Link]
Silicon Labs has introduced a tiny 3 mm square hybrid TV tuner chip which supports reception of all worldwide terrestrial and cable TV transmission standards. According to its preliminary data sheet, its design eliminates the need for an external balun, LNAs, SAW filters, and inductive power supply filtering. Some competing TV tuner solutions also eliminate the balun but can suffer from degraded NF and second-order distortion, which compromises reception. A fully integrated 1.8 V LDO power supply regulator enables single-supply operation, while a dual-supply option offers additional system flexibility. The chip also offers increased immunity to LTE interference, and a harmonic rejection mixer filters out Wi-Fi interference, eliminating the need for external filtering.
Tiny hybrid TV Tuner - [Link]
by Rob Matheson @ phys.org:
Stream video on your smartphone, or use its GPS for an hour or two, and you’ll probably see the battery drain significantly. As data rates climb and smartphones adopt more power-hungry features, battery life has become a concern. Now a technology developed by MIT spinout Eta Devices could help a phone’s battery last perhaps twice as long, and help to conserve energy in cell towers.
Beating battery drain: Power-conserving chip may increase smartphone battery life - [Link]
One thing IBM emphasizes about its neurosynaptic chip is that it works like the “right” brain (intuition and jumping to conclusions), whereas the “left” brain works more like a traditional computer, writes R. Colin Johnson.
IBM Builds World’s Biggest Brain-Chip - [Link]
Adding to its ever-growing family of power supply regulators, Linear Technology has introduced the LTC3807, a step-down switching regulator DC/DC controller driving an all N-channel external synchronous power MOSFET stage. The chip uses a constant-frequency current-mode architecture allowing a phase-lockable frequency of up to 750 kHz.
The chip draws just 50 μA of no-load quiescent current, and OPTI-LOOP compensation allows the transient response to be optimized over a wide range of output capacitance and ESR values. The LTC3807 features a precision 0.8 V reference and a power-good output indicator.
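With a fixed 0.8 V reference, the output voltage of a controller like this is normally set by an external feedback resistor divider: Vout = Vref × (1 + Rtop/Rbottom). The sketch below shows that arithmetic; the specific resistor values are illustrative assumptions, not values from the LTC3807 data sheet.

```python
V_REF = 0.8  # precision feedback reference voltage, in volts

def output_voltage(r_top_ohms, r_bottom_ohms, v_ref=V_REF):
    """Output voltage set by a feedback divider from Vout to the FB pin:
    Vout = Vref * (1 + Rtop / Rbottom).
    """
    return v_ref * (1 + r_top_ohms / r_bottom_ohms)

# Illustrative divider: 105k on top, 20k on the bottom -> 5.0 V output.
print(round(output_voltage(105_000, 20_000), 2))  # -> 5.0
```

In practice the divider values also trade off quiescent current against noise immunity, which matters for a part specified at 50 μA no-load current.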
Low-loss Step-down Regulator - [Link]
David Szondy @ gizmag.com writes:
If it weren’t for the microchip, your smartphone would be the size of a building and need its own power plant to work. Thanks to the integrated circuit and its modern incarnation in the microchip, electronics are a bit easier to carry around than that, and this week, Christie’s put one of the very first integrated circuits up for auction. Designed and constructed in 1958 by Texas Instruments, it’s one of the three earliest “chips” ever made and went on the block with an estimated value of up to US$2 million.
One of the world’s first integrated circuits goes up for auction - [Link]
by Matt Mcgowan @ phys.org:
Engineering researchers at the University of Arkansas have designed integrated circuits that can survive at temperatures greater than 350 degrees Celsius – or roughly 660 degrees Fahrenheit. Their work, funded by the National Science Foundation, will improve the functioning of processors, drivers, controllers and other analog and digital circuits used in power electronics, automobiles and aerospace equipment – all of which must perform at high and often extreme temperatures.
“This ruggedness allows these circuits to be placed in locations where standard silicon-based parts can’t survive,” said Alan Mantooth, Distinguished Professor. “The circuit blocks we designed contributed to superior performance of signal processing, controllers and driver circuitry. We are extremely excited about the results so far.”
Circuits capable of functioning at temperatures greater than 650 degrees Fahrenheit - [Link]