


Sweet Apps

257 Posts

In 1903, Orville Wright became the first person to pilot a powered airplane. The flight lasted a (probably terrifying) 12 seconds and covered 120 feet. Since then, airplanes have become much, much more sophisticated, but they still require onboard human pilots. However, a revolution is underway: more and more research is being conducted on the feasibility of unmanned aircraft.


One great example: Adam Amos of Rescue Robotics is developing a high-performance autopilot system that can be rapidly reconfigured for Unmanned Aerial Vehicle (UAV) competitions around the world. He’s using LabVIEW 2012 and Multisim to program his system, and he chose an NI sbRIO-9606 device because it has a large FPGA and a 400 MHz processor in a very lightweight package.


Picture of the UAV on its launcher with the ground station.


His plane has a wingspan of 1.8 meters and a maximum flight time of 25 minutes. How does he launch his aircraft, you ask? An elastic catapult. Watch the video of his plane in action below.



>> Learn more about NI Single-Board RIO.

>> Visit Adam’s blog for more photos, videos, and application details.


Fun facts about the Hawksbill sea turtle:

- They get their name from their sharp, curving beak.

- Their shells change color slightly based on water temperature.

- They primarily eat sea sponges but also occasionally eat jellyfish and anemones.


Not-so-fun facts about the Hawksbill sea turtle:

- They’re currently critically endangered, mostly due to human impact.


Fortunately, students at the School of Science and Technology, Singapore, are on the case! The Perhentian Islands, formerly a large nesting ground for green and hawksbill sea turtles, now contain a turtle hatchery to boost declining turtle populations. The students needed to build a hardware system to constantly measure the temperature of soil at different depths in order to determine if incubating eggs in pails is as viable as incubating eggs buried under the sand.



Baby sea turtle on the Perhentian Islands.

The students developed a multitemperature sensor-based data-logging system that can be buried at different depths to measure the temperature of soil over a 24-hour period. The temperature sensors are connected to a multiplexer, NI myDAQ hardware, and a computer running NI LabVIEW software. The information gathered from this system will be used to help keep sea turtles around for future generations to enjoy.
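The students’ actual implementation is a LabVIEW VI reading a multiplexer through NI myDAQ, but the logging logic can be sketched in Python. The three depths and the toy sensor model below are invented for illustration:

```python
import statistics

def read_channel(channel, t_hours):
    """Stand-in for a multiplexed DAQ read at one burial depth."""
    depth_cm = [10, 30, 60][channel]  # hypothetical sensor depths
    # toy model: deeper soil is cooler and has a smaller daily swing
    return 30.0 - 0.05 * depth_cm + (2.0 / depth_cm) * ((t_hours % 24) - 12)

def log_day(sample_period_h=1):
    """Poll every depth channel once per period over a 24-hour day."""
    log = {ch: [] for ch in range(3)}
    t = 0
    while t < 24:
        for ch in range(3):
            log[ch].append((t, read_channel(ch, t)))
        t += sample_period_h
    return log

def daily_mean(log, channel):
    """Average temperature recorded on one channel over the day."""
    return statistics.mean(temp for _, temp in log[channel])
```

Per-depth daily means and swings are the kind of summary that would feed the pail-versus-buried-sand comparison.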


>> Learn more about NI myDAQ here.


Robots have proven useful for accomplishing all kinds of difficult tasks, from performing surgery to deboning chickens. Now, thanks to a group of engineering students from the University of Texas at Austin, they’re getting in on the fun and games too.


Using the NI LabVIEW Robotics Starter Kit, the student team designed a robot that could identify and accurately shoot up to five pool balls into pockets selected by the user. The students equipped the robot to identify the balls by using the NI LabVIEW Vision Development Module and a live camera feed. They also added cue actuation and rotation mechanisms to the starter kit, creating a robot that slides along the side of the pool table and lines up each shot.




Running on NI Single-Board RIO, the robot wirelessly connects to a laptop that controls the spring-loaded cue mechanism. Once the user selects a pocket, the robot makes its angle calculations, adjusts into position, and shoots right on cue.
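The team’s exact aiming code isn’t shown here, but the standard “ghost ball” construction captures the angle calculation: to pot a ball, the cue ball must strike a point one ball-diameter behind the object ball along the pocket-to-ball line. A minimal Python sketch (using the standard pool-ball diameter, not necessarily the team’s setup):

```python
import math

BALL_DIAMETER = 57.15e-3  # standard pool ball, metres

def ghost_ball(ball, pocket, diameter=BALL_DIAMETER):
    """Contact point: one diameter behind the ball along the pocket line."""
    bx, by = ball
    px, py = pocket
    dx, dy = bx - px, by - py
    dist = math.hypot(dx, dy)
    return (bx + dx / dist * diameter, by + dy / dist * diameter)

def aim_angle(cue, ball, pocket):
    """Heading (radians) the cue ball must travel to pot the ball."""
    gx, gy = ghost_ball(ball, pocket)
    return math.atan2(gy - cue[1], gx - cue[0])
```

For a dead-straight shot the aim angle is simply the cue-to-pocket heading; cut shots fall out of the same two lines of geometry.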

>> Watch a video of the pool-playing robot in action.
>> Learn more about the NI LabVIEW Vision Development Module.


Remember when a wireless network connecting people all over the world was nothing more than science fiction? Or when the most cutting-edge Internet advancement meant not tying up a phone line while you surfed the web? Well, space agencies are taking wireless technology up a notch, and their latest Internet experiment is out of this world – literally.


NASA and the European Space Agency (ESA) recently tested a new communications protocol that could lead to an interplanetary Internet. At the International Space Station, commander Sunita Williams entered commands on her NASA-developed laptop. Hundreds of miles below, at an ESA facility in Darmstadt, Germany, a LEGO MINDSTORMS robot responded. Williams steered the robot from the space station, successfully controlling a vehicle on Earth’s surface as she orbited above the planet.




The LEGO MINDSTORMS robot, running NXT software powered by LabVIEW, was used to simulate a robotic rover on another planet. The experiment proved an astronaut can safely orbit above the planet and control a vehicle on the ground below.


So how exactly does this interplanetary communication work? NASA’s Disruption Tolerant Networking (DTN) protocol allows Internet-like communications to work despite long distances, time delays, and disconnections and errors by moving data through a network “hop-by-hop.” Williams used this experimental interplanetary Internet to drive the robot, and DTN may one day lead to a more connected solar system.
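The hop-by-hop idea can be illustrated with a toy store-and-forward model in Python — a sketch of the concept only, not NASA’s DTN implementation: each node keeps custody of a data bundle until its outbound link comes up, so delays and dropouts postpone delivery rather than losing data.

```python
from collections import deque

class Node:
    """A DTN-style node that buffers bundles until they can move on."""
    def __init__(self, name):
        self.name = name
        self.buffer = deque()
        self.delivered = []

    def receive(self, bundle):
        self.buffer.append(bundle)

def step(path, links_up):
    """One pass along the path: forward bundles over every link that is up."""
    for i in range(len(path) - 1):
        if links_up[i]:
            while path[i].buffer:
                path[i + 1].receive(path[i].buffer.popleft())
    # bundles reaching the final node count as delivered
    path[-1].delivered.extend(path[-1].buffer)
    path[-1].buffer.clear()
```

A command survives a broken second hop: it simply waits at the relay until the link returns.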


>> Find out more about LabVIEW for LEGO MINDSTORMS.


Whether it was in the early ’90s on a Super Nintendo or more recently on a Nintendo Wii, at some point in your life, you’ve probably played Mario Kart. The latest version of Mario Kart allows players to view the track from the perspective of the driver for the first time. But our engineers at Waterloo Labs topped this with their latest Sweet App: real-life Mario Kart.



From throwing banana peels and turtle shells to make your opponents’ cars careen out of control to absorbing the power of mushrooms and stars on the track to accelerate your vehicle, this real-life Mario Kart system has it all!


The team at Waterloo Labs used NI CompactRIO hardware to control all of the valves and servos in the system and to communicate with all of the other cars in the race. Using information received from RFID tags embedded in the items picked up on the track, the CompactRIO device determines which action to perform on the vehicle, such as braking, accelerating, or turning.
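Conceptually, that decision is a lookup from tag ID to vehicle action. A minimal Python sketch (the tag names and actions here are hypothetical; the real logic runs in LabVIEW on the CompactRIO):

```python
# Hypothetical item-to-action table: hazards slow the car, power-ups boost it.
ITEM_ACTIONS = {
    "banana":   "brake",
    "shell":    "brake",
    "mushroom": "boost",
    "star":     "boost",
}

def handle_rfid(tag_id, default="none"):
    """Return the action to perform when an item tag is scanned."""
    return ITEM_ACTIONS.get(tag_id, default)
```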


>> See another Waterloo Labs application: Karaoke on Fire.


When NI systems engineer Ben Black wants his favorite beer, he doesn’t run to the convenience store. He sets up his homebrewing equipment, adds ingredients like bourbon and oak chips, and creates his own porter in his backyard. But when Ben and his wife found out they were having a baby, he knew he wouldn’t have time to babysit his brewing system anymore. Instead of giving up on his hobby, he put LabVIEW and some recycled NI hardware to work to automate his system.


NI Single-Board RIO and a variety of C Series modules provided a flexible platform to monitor and control the temperature and heating elements, while LabVIEW tied the hardware together and powered the system. By adding LabVIEW Web services, Ben could even monitor his brew’s temperature from the couch.
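The couch-monitoring idea — exposing a live reading over HTTP — can be sketched with Python’s standard library. The real system uses LabVIEW Web services; the endpoint and JSON shape below are invented for illustration:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def make_handler(read_temperature):
    """Build a handler that serves the current reading as JSON."""
    class BrewHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps({"temperature_c": read_temperature()}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):
            pass  # keep the console quiet

    return BrewHandler

# Example: HTTPServer(("", 8080), make_handler(lambda: 65.5)).serve_forever()
```

Any browser on the home network could then check the brew without a trip to the backyard.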

The automated homebrewing system went public at NIWeek 2012 as one of the most popular demos on the expo floor. Watch the video below to see Ben explain how it all works.


>> Read the full interview with Ben on his homebrew system.
>> Learn more about LabVIEW Web services.


The day of the inaugural Formula 1 race in Austin, Texas, more than 117,000 people packed the stands at Circuit of the Americas to see the elite racecars in action. People from all over the world watched British driver Lewis Hamilton speed across the finish line and win the U.S. Grand Prix.


To be the fastest, each F1 driver relies on the unique design and capabilities of their car. Since each team is responsible for its own design, developing the fastest, most aerodynamic vehicle possible is an essential part of race preparation. Easier said than done: F1 cars are made up of around 80,000 components, and everything must work together flawlessly. That’s where National Instruments comes in.


NI products like LabVIEW system design software helped several F1 teams take their designs from paper to the track faster by offering a reconfigurable platform to test systems throughout the design validation stage. In addition to testing electronic control units, F1 engineers used NI software and hardware to test the rigidity, drag, carbon fiber strength, safety, and durability of the cars. With the help of these tools, today’s F1 teams are developing some of the most advanced racecar technologies in the world.


>> Watch a video to see how F1 engineers design and test their cars.

>> Find out more about LabVIEW.


We’ve all seen it in spy movies – the intruder approaches a high-security facility, armed with all the tools needed to sneak in. A computer program hacks the password. A molded finger pad fools the fingerprint system. A high-quality photo tricks the iris scanner. But what if the system analyzed something even tougher to fake – veins?


Researchers at Universiti Sains Malaysia have developed an identification system to do just that. Using NI Single-Board RIO and LabVIEW, it images a vein beneath the skin’s surface and will only authenticate the finger of a living person.


The finger-vein identification system.


Invisible, near-infrared light penetrates the finger, is absorbed by the hemoglobin of the blood in the vein, and appears as an infrared finger-vein image. The image is then adjusted using the image rotation VI in the LabVIEW Vision Development Module and analyzed with a finger-vein recognition algorithm created in LabVIEW. An automated feedback control system, implemented using an FPGA, adjusts the intensity of the light to ensure the best contrast. The system authenticates the image, comparing it to those of authorized people added during an enrollment period, with 98.93 percent accuracy.
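The illumination control is a feedback loop: measure image contrast, then nudge the light intensity toward a contrast target. A toy proportional version in Python (the gain, target, and contrast model are illustrative, not the researchers’ FPGA design):

```python
def adjust_intensity(intensity, contrast, target=0.8, gain=0.5):
    """One control step: move intensity toward the contrast target."""
    return intensity + gain * (target - contrast)

def settle(measure, intensity=0.5, steps=50):
    """Iterate the loop until the illumination settles."""
    for _ in range(steps):
        intensity = adjust_intensity(intensity, measure(intensity))
    return intensity
```

With a monotonic contrast response the loop converges to the drive level that yields the target contrast.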


>> Read how researchers are using body odor for personal identification.

>> Learn more about NI FPGA-based design.


Remember science class? You probably learned about reproduction among giggling, embarrassed middle schoolers as the teacher explained how sperm cells travel through the body. It turns out your biology textbook had barely scratched the surface, and until recently, so had modern science.


Because of the tiny size and high speed of human sperm cells, previous attempts to track their three-dimensional trajectories were limited. The sperm quickly swam out of the observation area of conventional lenses, so scientists had trouble collecting data on their 3D swimming patterns. By using a new lens-free technique based on LabVIEW, a group of researchers at UCLA tracked the natural trajectories of a large volume of sperm.

The lens-free 3D sperm-tracking setup.


The researchers set up an imaging platform, controlled by LabVIEW, that used partially coherent illumination at two different wavelengths to acquire holographic lens-free shadows of the sperm. They used this technique to track the swimming patterns of more than 1,500 individual sperm cells. Almost 93% of the cells followed the “typical” trajectory, swiftly moving forward with small side-to-side movements.
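One way to make a “typical trajectory” criterion concrete: compare a track’s net forward displacement against its worst sideways deviation from the straight chord. The threshold and criterion below are invented for illustration — the UCLA team’s actual classification is more involved:

```python
import math

def point_line_distance(p, a, b):
    """Distance from point p to the line through a and b (3-D cross product)."""
    ax, ay, az = (p[i] - a[i] for i in range(3))
    bx, by, bz = (b[i] - a[i] for i in range(3))
    cx, cy, cz = (ay * bz - az * by, az * bx - ax * bz, ax * by - ay * bx)
    return math.sqrt(cx**2 + cy**2 + cz**2) / math.sqrt(bx**2 + by**2 + bz**2)

def classify(track, ratio=2.0):
    """Label a list of (x, y, z) positions as a typical or atypical swim."""
    forward = math.dist(track[0], track[-1])  # net displacement
    lateral = max(point_line_distance(p, track[0], track[-1]) for p in track)
    return "typical" if forward > ratio * max(lateral, 1e-12) else "atypical"
```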


The lens-free setup allowed the researchers to collect data with submicron positioning accuracy from a volume of sperm cells much larger than allowed by conventional lenses. In addition to providing important information about human sperm patterns, the research may help scientists observe the 3D motion of other microorganisms and better understand the underlying biophysics in the future.


>> Download a free trial of LabVIEW.


Imagine going back in time to Italy in the 14th century. As the country recovered from the plague, Italians began cultivating the beginnings of the Renaissance. You would witness the rise of art, science, and, in 1386, the Duomo di Milano. This cathedral in Milan took five centuries to build, and the main spire wasn’t completed until 1762. The Duomo remains one of the largest Gothic structures in the world.


After 250 years of weathering and pollution, the spire needed a facelift, and LabVIEW and an NI hardware system turned out to be the right tools to get the job done.




Restoring the marble required crews to build a 90-ton freestanding scaffolding structure around the spire. In order to ensure the scaffolding didn’t touch the spire and to monitor the weight and wind conditions, the restoration organization called in a team from the Politecnico di Milano.

The team came up with an optical sensor system to monitor the position and movement of the scaffolding and spire. Optical fibers were routed to a PXI Express system in the bell room of the cathedral. The PXI system, powered by the LabVIEW Real-Time Module, acquired sensor data and constantly monitored the structure’s behavior. Any unexpected movement triggered an alarm at the supervision screen and sent an email to the project team members. The PXI system kept working through environmental conditions like lightning and earthquakes and gathered stable, accurate measurements over time.
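The alarm logic reduces to band monitoring: flag any displacement sample outside an allowed limit and notify the team. A simplified Python sketch (limits and message format are illustrative; the real system runs on PXI under LabVIEW Real-Time):

```python
def check_readings(readings, limit_mm):
    """Return the (index, value) of every out-of-band displacement sample."""
    return [(i, d) for i, d in enumerate(readings) if abs(d) > limit_mm]

def monitor(readings, limit_mm, notify):
    """Raise a notification per violation; return True if any alarm fired."""
    alarms = check_readings(readings, limit_mm)
    for i, d in alarms:
        notify(f"sample {i}: displacement {d} mm exceeds {limit_mm} mm")
    return bool(alarms)
```

In the real system the `notify` step drove both the supervision screen and the email alerts.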

>> Learn more about the NI PXI platform.


Did you know…

  • Honeybees beat their wings 11,400 times per minute, creating their distinctive “buzz”?
  • Bee pollination adds $15 billion in crop value in the U.S. alone?
  • Flowers look different to a bee than they do to a human?


Humans may not be able to sprout wings and buzz around like a honeybee, but thanks to a system developed at the University of Leeds, we can now see colors just like the insects do.




A group from the schools of geography and biology created an exhibit for London’s RHS Chelsea Flower Show that allowed visitors to see gardens through the eyes of a pollinator. The group used LabVIEW and the NI Vision Development Module to split, filter, and combine the spectral components of images from two streaming video cameras at the exhibit.


The system mimicked “bee vision” by shifting and combining the green, blue, and UV streams to create a bee’s view of the flowers, which was then displayed next to the human image from a normal RGB stream.
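Per pixel, that shift is just a channel remap: the bee’s green, blue, and UV signals are displayed in the red, green, and blue channels, and the red signal (invisible to bees) is dropped. A one-function Python sketch, with an invented pixel layout:

```python
def bee_view(pixel):
    """pixel: (uv, blue, green, red) sensor values, 0-255.
    Returns an (r, g, b) display pixel: green -> R, blue -> G, UV -> B;
    the red channel, which bees cannot see, is discarded."""
    uv, blue, green, red = pixel
    return (green, blue, uv)
```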


This shift in perspective drew visitors to interact with the exhibit, which focused on the garden’s role in ecosystem services like pollination. Isn’t that just the bee’s knees?


>> Learn more about the NI Vision Development Module.


Believe it or not, there are a few things humans can still do better than robots. With our critical thinking skills and emotional capacity, we’re holding off that whole robots-take-over-the-world thing for at least a few more years. However, we may soon have to remove “deboning a chicken” from the list of human-only skills.

For the last eight years, Gary McMurray and his team at the Georgia Institute of Technology have been working to create a robot that can make the precise cuts required to remove the meat from a chicken. Every chicken has a slightly different bone structure, and while a human can see and feel those differences, it’s much harder for a robot to make the right cuts.



The robot McMurray’s team developed is a cut above other automatic deboning machines that make standardized slices and leave meat behind. The robot uses a 3D imaging system that guides a surgical blade to the precise location where a human would make a cut.

So what’s powering this revolutionary robot? LabVIEW serves as the machine’s operating system, with FPGA technology collecting data and executing motion commands through NI CompactRIO every millisecond.

“For us, the real value of LabVIEW was the ability to focus on the controls system and modeling that was key to our research and not have to worry about how to write algorithms to do encoder sampling and decoding or a Butterworth filter,” said McMurray. “That allowed us to achieve our research goals and not waste time reinventing the wheel.”


>> Watch a video of the robot in action.

>> Check out another meat-related Sweet App: an automated barbecue powered by CompactRIO.


We use random numbers for many applications in life – from global climate prediction to air traffic control to the lottery. However, true randomness is hard to achieve. Why? Most random numbers are produced by computer algorithms, so they aren’t truly random: the process is deterministic, controlled by something a programmer designed.


Researchers at the Australian National University (ANU) have not only found a viable solution, but they have also developed the world’s fastest random number generator by listening to vacuum noise. No, not your household vacuum cleaner: this generator uses a sensitive light detector to measure virtual subatomic particles that spontaneously appear and disappear in a vacuum.
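Turning a noisy physical measurement into unbiased bits is a classic extraction problem. One standard technique, shown here on simulated samples, is von Neumann debiasing: threshold the samples into bits, then keep only unequal pairs. This illustrates the concept; it is not ANU’s actual processing chain.

```python
def to_bits(samples, threshold):
    """Threshold raw measurements into a (possibly biased) bit stream."""
    return [1 if s > threshold else 0 for s in samples]

def von_neumann(bits):
    """Map pairs (0,1) -> 0 and (1,0) -> 1; discard equal pairs.
    This removes any fixed bias in the raw stream, at the cost of rate."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out
```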

The NI PXI hardware behind the generator.

ANU researchers used NI technology, including NI LabVIEW system design software, an NI PXIe-1062Q real-time chassis, and an NI PXIe-7965R FlexRIO module, to build their Quantum Random Number Generator. The researchers used these tools primarily for data acquisition and processing.

“We chose NI products for their versatility and ease of programming – LabVIEW proved easier to use than FPGA programming languages like VHDL or Verilog,” said Dr. Thomas Symul, research fellow at ANU. “We were already using NI products in our Quantum Optics Lab,” added Professor Ping Koy Lam, “which is further proof of the versatility of the technology.”



The random number generator is available online for anyone to use. In fact, it has already received several million hits. Now everyone has access to truly random numbers!


>> Read the full paper on the ANU Quantum Random Number Generator.



If you haven’t heard, Austrian skydiver Felix Baumgartner casually dove off a platform 24 miles above the ground yesterday. Over 8 million viewers watched live on YouTube as “Fearless Felix” became the first human to break the sound barrier with his body. Baumgartner and his team spent many years preparing for the record-smashing jump— and the technology they used to ensure his safety included NI LabVIEW software.



Of course, Baumgartner’s survival wasn’t guaranteed. In fact, the live feed was delayed 20 seconds in case of an accident. Even after he avoided an unstoppable spin (narrowly), his life depended on the integrity of his pressure suit. At those altitudes, temperatures plummet to 70 degrees below zero Fahrenheit, and the atmosphere is so thin that his blood would have vaporized had his equipment failed.


Testing his pressurized jump suit and helmet was itself a major goal of the mission. His suit, equipped with sensors and recorders, measured everything from his speed to his heart rate. Back at mission control, his team used LabVIEW to monitor readings like altitude, pressure, and oxygen levels. In the future, such equipment could save an astronaut's life if a spacecraft malfunctions.


The Red Bull Stratos jump is proof that with proper testing and the right technologies, not even the sky is the limit for human accomplishment. Congratulations to Baumgartner and his entire team!


>> Check out our first Sweet App on Red Bull Stratos.


If old cell phones are filling up your junk drawer at home or that iPhone 5 upgrade is looking too good to pass up, we have good news! Thanks to ecoATM, you can recycle your used electronic devices in just a few minutes at your local shopping mall. Oh, and did we mention you get cash in return?



The ecoATM kiosks use LabVIEW to identify and evaluate devices through Neural ID’s Concurrent Universal Recognition Engine (CURE). The CURE Library for LabVIEW builds Knowledge Maps to recognize object characteristics. By referencing a library of 4,000 electronic devices ranging from smartphones to MP3 players, the machine identifies the device, tests its condition, and determines a payout value.
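At its core, library-based identification is a nearest-match search over reference feature vectors. A toy Python sketch (the feature set and device entries are invented; the kiosk’s real matching is done by the CURE engine):

```python
import math

# Hypothetical reference library: device name -> feature vector,
# e.g. (height mm, width mm, screen size in).
LIBRARY = {
    "phone_a": (120.0, 60.0, 4.0),
    "phone_b": (110.0, 58.0, 3.5),
    "mp3_x":   (90.0, 40.0, 2.0),
}

def identify(features):
    """Return the library entry closest to the measured feature vector."""
    return min(LIBRARY, key=lambda name: math.dist(LIBRARY[name], features))
```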

It’s green too! Approximately 75% of devices collected at ecoATM kiosks are reused, and the other 25% are responsibly recycled, keeping electronics out of landfills. EcoATMs can already be found in about 150 locations in 10 U.S. states, with more popping up every day.

>> Find an ecoATM location near you.


>> Learn more about Neural ID's CURE Library for LabVIEW on the LabVIEW Tools Network.

