
The future of biofuel may rely on animal fats

November 20, 2014

Biofuel has been a popular topic for a number of years now, with many innovators looking to products like corn oil for the future of fuel. However, for some people, the notion that vegetable oil or animal fats could one day be used to fuel America's motor vehicles is not only absurd, but potentially dangerous from an economic standpoint. Corn, for instance, accounts for a huge portion of the American diet. If corn crops had to be split between food and biofuel production, it is conceivable that the price of this commodity could skyrocket.

Running on gators 
In an effort to identify new sources of renewable biofuel that would not compete with food supplies, some researchers have been exploring the potential of using animal fat that would otherwise be discarded during the butchering process. Specifically, scientists have been analyzing the potential of alligator fat as a base for biofuels.

In 2008, roughly 700 million gallons of biodiesel were produced in the U.S., according to the American Chemical Society. Much of that came from soybean oil, along with oil recycled from deep fryers in fast-food restaurants and from sewage. In the same year, the meat industry disposed of 15 million pounds of alligator fat in landfills. If that fat were redirected into biofuel production, it could significantly offset the amount of soybean oil used for this purpose.

Kentucky fried fuel
Still, some skeptics are not convinced of the feasibility of biofuel as a new source of energy for motor vehicles. These purists claim that it is simply not as good as fossil fuels, but one professor from Middle Tennessee State University is on a quest to prove them wrong. Cliff Ricketts, Ph.D., recently embarked on a cross-country journey from Key West, Florida, to Seattle, Washington, and back to Murfreesboro, Tennessee, in a 1981 Volkswagen Rabbit fueled solely by biodiesel. The fuel is a blend of animal fat and waste vegetable oil collected from MTSU dining facilities.

It's important to note that it's not possible to simply empty a deep fryer into a diesel tank. The oil first needs to be refined by a process called transesterification, which converts fats and removes the glycerin from the oil while combining fatty acids with alcohol. The result is a combustible, drop-in fuel that can be burned in just about any diesel engine. If the journey is a success, it will demonstrate the viability of this technology.
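For readers curious about the chemistry, transesterification is usually summarized with a schematic equation along these lines (a textbook generalization, not a detail reported from the MTSU project):

$$\underbrace{\mathrm{C_3H_5(OOCR)_3}}_{\text{triglyceride (fat or oil)}} \;+\; 3\,\mathrm{CH_3OH} \;\xrightarrow{\ \text{base catalyst}\ }\; \underbrace{3\,\mathrm{RCOOCH_3}}_{\text{methyl esters (biodiesel)}} \;+\; \underbrace{\mathrm{C_3H_5(OH)_3}}_{\text{glycerol}}$$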

Engineers enable printing of electronic circuits

Whether they know it or not, consumers across the globe use electronic circuits every day. These small components are present in headphones and speakers, where they play a pivotal role in turning digital information into sound, and they power the RFID tags used to track goods across innumerable industries.

These circuits are extremely complex, and as such they are difficult to create even for the most seasoned veterans working with the latest engineering information technology.

That may no longer be the case, as researchers at Nanyang Technological University Singapore recently discovered a new way to print flexible electronic circuits.

Incorporating new materials
A release from the school detailed that this was made possible by printing the complex circuits out in layers atop materials like paper, plastic and aluminum foil, using nanoparticles, carbon and plastic. The strategy is "additive" - material is deposited in layers rather than etched away - which is why it avoids the toxic chemicals and oxidizing agents used in conventional circuit fabrication.

The new method can enable the printing of the objects in mere minutes, and the tactic is also highly scalable in terms of the size of the circuits. 

Potentialities for the future
As NTU Singapore associate professor Joseph Chang noted in an official statement, this could have positive implications for research across various industries, including the potential for mass production of printed electronic circuits.

"This means we can have smarter products, such as a carton that tells you exactly when the milk expires, a bandage that prompts you when it is time for a redressing and smart patches that can monitor life signals like your heart rate," Chang explained.

NTU as an innovator
The college has a long history of innovation in the engineering realm. More recently, according to Channel NewsAsia, the university held a public symposium to discuss attracting more women to scientific fields, including the engineering sector.

"We now have fewer women in the important fields of engineering, science and technology, and we are missing out all that potential and half of the best brains," college President Bertil Andersson noted, in a statement.

On Nov. 7, the college organized the "Women in Engineering, Science and Technology Symposium" to attract interested women from Singapore as well as other nations. The event featured speeches from a number of bright minds in these fields, such as Professor Ada Yonath, winner of the 2009 Nobel Prize in Chemistry, and Professor Daniela Rhodes, a member of the U.K.'s Royal Society. A number of other educators working at NTU also gave presentations.

New method to measure stress in 3-D printed materials tested

November 20, 2014

Across the globe, 3-D printing seems to be one of the most innovative and interesting technologies in development today. Many individuals are imagining a future in which they can, from the comfort of their own homes, print out things like food, clothing, toys, tools, spare parts and other items that they would normally have to leave home to obtain.

However, much more goes into the process. First, consumers need to buy the printer and the filaments that will become the material of the finished goods, and then they have to download a blueprint for the object they want to create.

Interestingly, it's not strictly a consumer-focused technology. Doctors are already using these machines to print out medical necessities, such as stents, grafts and other implants, while architects are also beginning to use the printers to craft materials. This presents a new learning scenario for professionals in various industries.

For instance, some scientists and researchers are considering how 3-D printed objects will be categorized and detailed within engineering database platforms. These can be great repositories for mechanical engineers, among others, who need to know more about materials.

As interest in and use of 3-D printed parts expand, one element in particular is becoming prominent in engineering research - the residual stress that builds up in 3-D printed metal parts. Professionals in the sector recently developed a new means of measuring this property, something that could change the face of the engineering industry in the near future.

Why is this important?
3-D printed materials are being used more and more frequently, for things like machine parts and structures. As such, professionals need to verify the durability and strength of such materials to ensure the safety of all individuals who come into contact with the objects.

Moreover, these parts aren't machined from solid metal, Science Daily pointed out. Rather, 3-D printed metal parts are built up layer by layer, with each layer of metal powder fused to the one below it by a high-energy laser beam inside the printer.

Researchers leverage additive manufacturing
According to the source, scientists at the Lawrence Livermore National Laboratory discovered a way to measure the residual stress in parts built by powder-bed fusion additive manufacturing.

Residual stress concerns those involved in the printing process because the build involves significant temperature shifts, particularly near the last melt spot. Heating and cooling cause expansion and contraction, potentially putting the strength of the final product in question, especially for thicker pieces.

The news provider also pointed out that warping, detachments and total failure might also be possible unless durability can be verified.

This is where digital image correlation comes into play, the source detailed. Using DIC, an image analysis method, researchers at the lab found that they could collect stress data by comparing images of the part before and after it comes off the build plate. If there is no distortion, there is no residual stress.
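As a rough illustration of the before-and-after comparison at the heart of DIC, the sketch below estimates a simple rigid shift between two grayscale images using FFT-based phase correlation. This is an intuition-building simplification, not LLNL's actual pipeline: real DIC tracks a speckle pattern and recovers full displacement and strain fields, while this toy only finds one overall offset.

```python
import numpy as np

def estimate_shift(reference: np.ndarray, deformed: np.ndarray) -> tuple:
    """Estimate the (row, col) pixel shift between two grayscale images
    using FFT-based phase correlation. A stand-in for full DIC, which
    would return a displacement field rather than one rigid offset."""
    cross_power = np.fft.fft2(reference) * np.conj(np.fft.fft2(deformed))
    cross_power /= np.abs(cross_power) + 1e-12          # keep phase information only
    correlation = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(correlation), correlation.shape)

    shift = []
    for index, size in zip(peak, correlation.shape):
        shift.append(index - size if index > size // 2 else index)  # wrap to signed shift
    return float(shift[0]), float(shift[1])

# If images of the part before and after it comes off the build plate line up
# (shift ~ 0, no local distortion), little residual stress was locked in.
```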

According to Science Daily, the lab compared its results with those of the Los Alamos National Laboratory, which uses the proven method of neutron diffraction. Only three facilities have the capability to perform such tests because high-energy neutron sources are so scarce. LLNL's findings matched those of LANL, supporting the reliability of the new method.

The future of 3-D printing
This could have major implications for the use of 3-D printed materials in structural applications.

However, the industry continues to move into even larger, unexplored realms. Spaceflight Now explained that the first 3-D printer intended for use in outer space has been activated by astronauts onboard the International Space Station. This raises more questions that engineers may be interested in - a part might hold up under stress on Earth, but what about in zero gravity?

"We figured out all the tricks of making a 3-D printer work in zero gravity - everything from how to make the mechanics different - each layer is fractions of a micron from one layer to the next," Jason Dunn, co-founder and chief technology officer at printer creator Made in Space, told the source. "In zero gravity, you have things floating around, and if they float within those fractions of a micron, it throws off the whole print."

Another consideration before the printer could be used in space was whether it could heat the material enough to melt and form objects.

The news provider noted that it's been smooth sailing thus far. The printer arrived at the space station Sept. 23 and was unpacked and set up by Butch Wilmore in the Destiny laboratory module. This particular machine doesn't enable metal objects to be created - it currently uses a special type of plastic, acrylonitrile butadiene styrene, which is also used to make Lego bricks. 

Universities show their commitment to engineering research

November 18, 2014

Engineers across the globe have a broad range of resources at their fingertips when they need to reference anything from material density values to the latest engineering research. They can access databases, read research briefs and so on.

However, this interest in the field generally starts somewhere - professional engineers have to go through adequate schooling before they can earn that honor. Many colleges across the globe offer worthwhile engineering programs to undergraduate and graduate students so they can enrich their lives and pave a solid path to the future.

The fact that this career route is becoming more and more popular and profitable is evidenced by the numerous universities pledging their commitment to engineering.

Michigan State's facility to open in 2015
According to the Lansing State Journal, Michigan State University is among the most notable colleges leading this charge. The university invested in a new biomedical sciences and human health research facility, called the MSU Bio Engineering Facility, which is set to open in summer 2015. Moreover, the source stated, the building itself is innovative in that it has an open-floor design to promote collaboration and boasts energy-conservation elements that should help the school save money over time.

The building is set to be the headquarters for a number of concentrations, from nanotechnology to robotics to health care.

"The student demand in engineering has really never been higher. We now have something like 6,000 students in the college. It has gone up considerably over the past five or six years," MSU College of Engineering Dean Leo Kempel told the Journal. "The college is growing and will require us to increase faculty to meet that need, and look hard at our use of space."

The Lansing Regional Chamber of Commerce recently honored the college for its $61 million investment, the source noted.

Cornell gets involved in big competition
Not to be outdone, Cornell University engineers have been very busy lately, according to the Cornell Chronicle. The school paper detailed that some students recently joined the DARPA Robotics Challenge, which is a multi-year competition open to the international community. 

Team ViGIR, which stands for Virginia-Germany Interdisciplinary Robotics and includes professionals from the University of Hanover and Oregon State University, hopes to take home the top prize of $2 million in June 2015. The aim is to create a robot that can help humans respond to natural or man-made disasters, the source reported. Cornell engineers have been charged with bringing partial or full autonomy to the robotic platform, the news provider explained, thereby automating tasks and allowing for robotic decision-making.

The team named the robot Florian, and it will compete in California in 2015, though it cannot take the top spot unless it successfully completes 10 tasks in an hour.

Texas college becomes part of consortium
In an effort to open up the field for more innovation by students and faculty, the University of North Texas College of Engineering recently joined the Cold-Formed Steel Research Consortium. This way, research can be more comprehensive and students in various locations can collaborate.

"Joining the consortium gives UNT and partner institutions an opportunity to combine our expertise and research facilities to advance research and knowledge on cold-formed steel," noted associate professor Cheng Yu, as quoted by Phys.org.

The college joins other schools with an engineering concentration, including the University of Massachusetts-Amherst, Johns Hopkins University, McGill University and Northeastern University.

Individuals will have more access to information about cold-formed steel, the source reported, which is used in the construction of everything from bridges and transmission towers to car bodies.

A cost analysis of mechanical engineering salaries and tuition

November 15, 2014

After years of recession, college grads are finally in a position to start taking advantage of a decent job market. Specifically, the outlook for mechanical engineers is better than it has been in years. According to U.S. News & World Report, the average salary for a mechanical engineer in the United States has increased by over $20,000 since 2004. Moreover, since this is one of the broadest engineering disciplines, according to the Department of Labor, there are many employment opportunities for budding mechanical engineers as they enter the workforce.

Slow growth, but improvement nonetheless
In fact, the Department of Labor reports that mechanical engineers held 258,100 jobs in the U.S. in 2012. Looking toward the future, the department also projected employment growth of about 5 percent from 2012 to 2022, which is slower than average but still positive considering how many sectors are still recovering from the recession. This means that while new graduates will definitely find the job market competitive, they are still in a good position to find gainful employment with their engineering degrees.

Jobs in mechanical engineering
Mechanical engineers work in a wide range of industries in both public and private sectors. Basically, in any field that requires the development, building or testing of mechanical devices like tools, engines and machines, you're going to find mechanical engineers. They work in product development, but they also participate in strategy and maintenance. Some mechanical engineers operate out of professional office settings, while others are based in laboratories or even directly at work sites. The Department of Labor indicated that most mechanical engineers work in engineering services, research and development or manufacturing.

This is the perfect career path for anyone who enjoys working conceptually with mechanical systems. Mechanical engineers design and develop these systems, but they also create maintenance routines and optimize machine workflows. It is a career path that requires keen attention to detail and a solid grasp of advanced mathematical and scientific fields such as physics, thermodynamics and aerodynamics. But even with all of the education required to do the job, most mechanical engineers practice with a terminal bachelor's degree, making the field more cost-efficient than many other career paths.

The cost of education
As in many fields, fledgling mechanical engineers can expect to pay a little over $40,000 per year for their degree at a private institution. For example, U.S. News & World Report reported that tuition at the Massachusetts Institute of Technology, which has one of the highest ranked mechanical engineering programs in the world, let alone the country, costs $43,210 per year. Similarly, Stanford comes in a bit higher at $45,480, and both Carnegie Mellon and the California Institute of Technology weigh in around $40,000. This means that, on average, mechanical engineering students can expect to pay roughly $42,170 per year to attend a private university, which adds up to $168,680 over four years of schooling. There are more affordable options, however. Students who choose to attend their state universities can expect to pay closer to $12,000 per year for tuition. Over four years, that's only about $48,000, a much more affordable figure.

Nonetheless, mechanical engineering students who accrue $168,680 of debt from school will still find themselves in a tenable position once they land a job. With average salaries for mechanical engineers coming in around $80,000, most professionals are able to keep up with student loan payments without issue. According to the Department of Education, most graduates can expect to pay approximately 15 percent of their income toward student loans until they are paid off. That works out to about $12,000 per year for a new mechanical engineer, meaning that without any assistance or debt forgiveness, the debt could be zeroed out in roughly 14 years. That might sound cumbersome, but compared to professions that require graduate degrees and two to six additional years of school, mechanical engineers are at a distinct advantage in paying off student loans in a timely fashion.
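For concreteness, here is the article's arithmetic laid out as a small script. It simply restates the figures above and, like the article, ignores loan interest, tuition growth and living costs.

```python
# Back-of-envelope check of the figures in this article (no interest, no
# tuition growth -- these are the article's own simplifications).
PRIVATE_TUITION_PER_YEAR = 42_170   # article's average across four private schools
YEARS_OF_STUDY = 4
SALARY = 80_000                     # article's average mechanical engineering salary
REPAYMENT_SHARE = 0.15              # share of income going to loans, per the article

total_debt = PRIVATE_TUITION_PER_YEAR * YEARS_OF_STUDY   # $168,680
annual_payment = SALARY * REPAYMENT_SHARE                # $12,000
years_to_repay = total_debt / annual_payment             # ~14.1 years

print(f"Total private-school tuition:   ${total_debt:,}")
print(f"Annual loan payment:            ${annual_payment:,.0f}")
print(f"Years to pay off (no interest): {years_to_repay:.1f}")
```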

Advanced degrees
Although it is not strictly required, mechanical engineers who choose to advance their education beyond undergraduate classes can expect to collect a larger paycheck. According to PayScale, mechanical engineers with a bachelor's degree pull in somewhere between $46,976 and $86,500 per year. With a master's degree, that range increases to $60,043 to $108,050. Interestingly, for engineers who are primarily interested in the bottom line, there is actually a slight drop in the top end of salaries for those who complete doctorate programs. Mechanical engineers with a Ph.D. make anywhere from $62,400 to $104,456, which means that the top end of the spectrum is actually lower than that of mechanical engineers holding only a master's degree. This discrepancy could be explained by the fact that many of those holding doctorates work in academia, where pay is commonly lower than in the private sector.

All things considered, the average mechanical engineering salary is sizable, and for those who enjoy working with machines, this is the perfect field.

Fracking research finds little fault in fluids

November 12, 2014

Over the past few years, the oil and gas industry has been drawing heat for its use of controversial fracking practices. Hydraulic fracturing is a method of extracting natural gas from shale rock layers situated deep underground. These layers of shale were previously inaccessible, even though they contain a massive amount of untapped natural gas. Recently, fracking has made it possible to access this gas by injecting highly pressurized fluids into the shale. The new channels formed by the drilling process allow gas to escape at a much higher rate than it otherwise could. Although fracking can produce some impressive yields of natural gas, many people are concerned about the potential environmental impact the practice could have.

Risk of toxicity 
Most discussions of fracking have focused on the use of hydraulic fracturing fluids, which are composed of a wide spectrum of different chemicals. Fracking may have led to a boom in American energy, but it is also reported to have led to the pollution of well water and drill sites. The problem with this practice is that there is no way to reclaim the fluid once it has been used, which means it often leaks into groundwater and aquifers near the drilling site. According to the American Chemical Society, approximately 33 percent of the 200 commonly used components of fracking fluid have not been sufficiently tested for health risks. More troubling is the fact that eight of the 200 are known to be toxic to mammals.

Nonetheless, the fracking industry insists that the chemicals used to make the fluid are commonly used in the food industry, and that the rest are so diluted by the time they reach the groundwater that they might as well not be there at all. In order to get to the bottom of this debate, William Stringfellow, director of the ecological engineering research program at the University of the Pacific in Stockton, California, set out to evaluate the potential health risks posed by the chemicals in hydraulic fracturing fluid.

Databases and research
The first step for Stringfellow and his team was to pore over databases and reports to come up with a complete list of the substances most frequently used in fracking. This list included all kinds of materials for different tasks, such as gelling agents that thicken the fluids, biocides that kill microbes and sand that makes it easier for the fluid to crack open shale. While analyzing these materials, the researchers came to the realization that both sides were right, to a degree. Stringfellow found that the fluid was in fact made of mostly nontoxic and food-grade materials, but that doesn't necessarily mean the materials can be easily or safely disposed of. By way of example, Stringfellow equated fracking to pouring a truckload of food waste down a storm drain.

Before these compounds are left to filter into groundwater and aquifers, they need to be treated, because the raw fluids have proven toxic to aquatic life. Nor are the remaining materials necessarily benign - additional research needs to be done to determine the effect they have on ecosystems.

Testing the water
While Stringfellow and his team took a broad, research-based approach, researchers from the University of Colorado at Boulder were tasked with identifying the toxicity of a specific class of chemicals used in fracking fluid: surfactants. Surfactants reduce the surface tension between water and oil, helping produce higher yields of oil and gas during fracking. In order to evaluate the potential toxicity of these materials, the researchers took samples from five fracking sites in Colorado. Their analysis indicated that the chemicals used in the surfactants are commonly found in everyday products like toothpaste, laxatives, detergent and even ice cream.

However, this does not mean that all fracking is 100 percent safe. As Stringfellow demonstrated, many chemicals in these products, if untreated, can still have a negative effect on the environment. Moreover, while the five well sites in question came back negative for toxicity, not all fracking operations use exactly the same chemicals in their fluid.

Looking to the future
At the end of the day, all of this research indicates that additional testing will still be required to definitively say whether or not fracking is bad for the environment. The chemicals in toothpaste and ice cream have obviously been tested extensively on humans just through use. However, there are many other plants and animals that may have adverse reactions when exposed to these chemicals. For instance, while the compounds used for biocides may be harmless to humans, they could have a more devastating effect on local wildlife than initial reports seem to suggest.

Looking forward, it's important to recognize that fracking is a double-edged sword - it may produce impressive results, but the risk of unknown health and environmental hazards may prove to outweigh the benefits.

Russian scientists bring laser-propelled spacecraft closer to reality

October 31, 2014

Laser-spewing thrusters are as iconic as any science fiction visual motif, but scientists at the Ioffe Physical Technical Institute may have brought this technology off the silver screen and into reality. Their research, published in Applied Optics, demonstrates how a combination of lasers and rocket propellant greatly improves the efficiency of rockets and may send aerospace design in entirely new directions, according to The Optical Society. The latest breakthrough in supersonic rocket design, the culmination of more than a decade of research, is set to help mankind extend its reach further beyond the atmosphere.

A decade in design
Yuri Rezunkov, physicist and co-author of the project, explained to Motherboard that a laser propulsion design would provide a "sufficient decrease in mass of onboard propellant that a vehicle to be launched has to have" compared to the amount necessary in traditional aerospace design. The idea is not entirely new - it has been the subject of experiments as far back as the turn of the century. Leik Myrabo, a professor at Rensselaer Polytechnic Institute, pioneered a form of laser-powered propulsion in 1999, according to NASA. He and his students were able to use laser light to superheat air beneath a 5-meter spacecraft with seating for four. The superheated air produced a jet exhaust that propelled the craft into the air.
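A rough way to see why cutting onboard propellant matters so much is the ideal (Tsiolkovsky) rocket equation, which ties the required propellant fraction to exhaust velocity. The figures below are illustrative assumptions, not numbers from Rezunkov's paper; the point is simply that a hotter, faster exhaust - which is what a laser-induced plasma jet provides - lets the same mission fly with far less fuel.

```python
import math

# Illustrative only: the ideal rocket equation, with made-up exhaust velocities,
# to show why higher exhaust velocity shrinks the propellant a vehicle must carry.
DELTA_V = 9_400.0  # m/s, a rough figure for reaching low Earth orbit

def propellant_fraction(exhaust_velocity_m_s: float) -> float:
    """Fraction of liftoff mass that must be propellant to achieve DELTA_V."""
    return 1.0 - math.exp(-DELTA_V / exhaust_velocity_m_s)

# Hypothetical comparison: chemical-rocket-class exhaust vs. faster ablation jets
for v_e in (3_500.0, 7_000.0, 14_000.0):
    print(f"v_e = {v_e:>7,.0f} m/s -> propellant fraction = {propellant_fraction(v_e):.2f}")
```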

Taking advantage of plasma plumes
Myrabo's design for a laser-propelled rocket, as well as subsequent attempts to refine it, has been limited by instabilities in thrust production. Rezunkov and his team at the Ioffe Institute aimed to resolve this performance flaw through a technique called laser ablation, said Motherboard. By firing laser pulses at the propellant stored inside a rocket, the physicists were able to induce a phenomenon called a plasma plume: the pulses superheat the fuel into a jet of charged particles powerful enough to send rockets farther and faster on significantly less propellant. Even more impressive is the fact that the laser can be fired remotely, removing the need for a heavy laser device onboard the spacecraft.

Ablation was key to resolving the issue of unpredictable thrust. Rezunkov and his colleagues adjusted the laser ablation process so that the plasma jets flowed more predictably along the inner walls of the ship's nozzle. The result was a far more efficient liftoff than with a traditional rocket weighed down by excess fuel. Unfortunately, the technique has been stalled for lack of appropriate equipment: a combination supersonic shock tube and high-power CW laser would be necessary to generate additional results for the research.

Taking care of space debris
Rezunkov has big plans for laser-propelled rockets as the technique matures, according to Motherboard. His pet project is a mini-vehicle with a laser propulsion system that could be used to remove debris and other floating objects from the regions of space directly outside Earth's atmosphere. Presumably, such technology would become increasingly critical as mankind begins to develop cheaper, more efficient spacecraft. Laser-propelled mini-vehicles would have other uses beyond clearing garbage from the planet's doorstep - the cost-effective spacecraft could be used to monitor space stations and celestial bodies that travel close to the planet.

Hybrid rockets using lasers and traditional propellant have come a long way in 15 years, and laser ablation techniques have massive potential to take rocket design to the next level. Rezunkov noted in Applied Optics that the technology is suited both to launching small satellites and to propelling aircraft at up to Mach 10.

Robot-assisted surgeries create new possibilities and challenges

October 29, 2014

Robotic-assisted surgical systems, also known as robotic surgery, are devices that support a physician's efforts to perform surgery with high-tech software and refined movement controls, according to the U.S. Food and Drug Administration. Robotic surgery is helpful in a wide range of procedures, from minimally invasive routines to complex, time-consuming surgeries. The actual technology is more akin to computer-assisted tools than to true robots - the robotic-assisted surgical systems do not move or act on their own. The technology has been praised by doctors as an innovation that can make surgeries easier on doctors and patients. However, a well-rounded view of the technology reveals that robotic surgery is still in its developmental stage.

Robots take the burden off surgeons
The Heart and Stroke Foundation of Canada noted that one robotic-assisted technique that has made great strides is coronary artery bypass grafting. Canadian doctors found that robotically assisted CABG procedures can help limit post-surgery problems and quicken recovery time. The technique is less invasive than traditional bypass surgeries and helps to reduce pressure on the surgeon.

Bypass surgeries require surgeons to remove a section of the patient's vein and then use that section to route blood around a blockage that is preventing the flow of oxygen-carrying red blood cells to the heart. The operation requires a still wrist, and doctors are finding that the steadiest hands belong to the robots. Surgeons can reduce the difficulty of bypass surgeries by working from a surgeon console, a computer that displays 3-D images of the patient's organs. The surgeon "operates" on the 3-D image while multiple robotic arms mimic the doctor's movements and perform the surgical incisions. The physician sits at the console, controlling the robot with his or her fingers while focusing on the model of the patient. The medical industry has a lot to gain by finding new ways to remove human error from the caretaking process.
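Two features commonly attributed to surgical consoles - motion scaling and tremor filtering - are a large part of what gives the robot its "steady hands." The snippet below is a purely illustrative toy with invented parameters, not any vendor's control software: it scales down simulated hand motion and smooths high-frequency jitter with a simple moving average.

```python
import numpy as np

# Toy illustration of motion scaling and tremor filtering. The scale factor,
# filter window and simulated tremor are assumptions for demonstration only.
MOTION_SCALE = 0.2     # 1 cm of hand travel maps to 2 mm of instrument travel
FILTER_WINDOW = 15     # samples averaged to damp hand tremor

def instrument_path(hand_positions_mm: np.ndarray) -> np.ndarray:
    """Map raw hand positions (N x 3, in mm) to smoothed, scaled instrument positions."""
    kernel = np.ones(FILTER_WINDOW) / FILTER_WINDOW
    smoothed = np.column_stack(
        [np.convolve(hand_positions_mm[:, axis], kernel, mode="same") for axis in range(3)]
    )
    return smoothed * MOTION_SCALE

# Example: a straight 50 mm reach contaminated with ~0.5 mm of simulated tremor
t = np.linspace(0.0, 1.0, 200)
hand = np.column_stack([50 * t, np.zeros_like(t), 0.5 * np.sin(80 * t)])
print(instrument_path(hand)[-5:])  # final instrument positions, in mm
```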

Using the robot provides doctors with a better view of the patient and allows them to make more precise cuts than ever before. This attention to detail helps patients by shortening their recovery time and reducing their blood loss during surgery. Less bleeding translates to lower demand for replacement blood. In this way, robotic-assisted surgeries can help the entire medical community by slowing the consumption of resources. Robotic systems ease the physical demands placed on the doctor as well. Surgeons must no longer contend with the anxiety that a slight touch could harm or even kill a patient. The costs recovered thanks to robotic systems could then be used to improve the hospital and support fellow physicians. 

Living with cybernetic caveats
Unfortunately, robotic-assisted surgical systems are not perfect for every procedure. In fact, robotic systems can actually create additional complications during some operations. For instance, The Wall Street Journal pointed out that robotic systems have proved ill-suited to some gynecologic surgeries. A study from Columbia University showed that 15 percent of ovary removal procedures were performed with robotic assistance in 2012, an increase from 3.5 percent in 2009. Complications in the removal of ovarian cysts also rose sharply over the same period, up 10 percent between 2009 and 2012. Likewise, the rate of complications in robot-assisted gynecologic surgery (3.4 percent) is considerably higher than in non-robotic surgeries (2.4 percent).

Robotic surgery is also very expensive, and the high price point for devices prevents numerous health facilities from seriously considering an investment in robotics-assisted surgery. The system requires further development, but these issues should not overshadow the method's potential. If robotic surgery can consistently improve patient quality of life, then the technology is worth the cost.

Advanced fiber optics will help to enhance ocean research

October 28, 2014

The average person typically thinks of high-speed Internet and cable channel bundles when hearing the phrase "fiber-optic cables." However, this groundbreaking medium for transporting energy and information has far more important applications than streaming seasons of "Game of Thrones." Optical fibers are particularly useful as sensors, in part because as passive sensors they do not require an electrical power source to function. The same unique light-refracting properties that make optical fibers ideal for carrying television and Internet signals also make the material perfect for collecting information. Breakthroughs in optical fiber technology could help researchers learn more about several oceanic environments.

Enter optical microfibers
Researchers from the Charles Fabry Laboratory and the French National Centre for Scientific Research have discovered unique light diffusion behavior in optical fibers with a diameter of 1 micrometer, according to the center's website. This diffusion behavior can be manipulated based on the microfiber's environment, so scientists are confident the material will become key to next-generation optical sensors. Microfibers are able to detect surface acoustic waves, as well as high-frequency sound waves inaudible to the human ear. These properties make optical microfibers ideal for new types of scanning equipment, and may even prove useful to the defense industry.

The fibers were produced at the Charles Fabry Laboratory by narrowing standard silica fibers through an intense heating and stretching process, said the CNRS. The end results were thinner than average microfibers with the potential to advance sensor technology. Scientists at the CNRS took over at this point, experimenting with the new fiber by shining a laser beam down its length. The researchers subsequently observed a new type of light behavior, specific to subwavelength-diameter fiber, that takes place when light travels through microfibers. Researching these new behaviors provides clues to how light and sound impact each other at the smallest scale.

Research by the sea
By upgrading optical fibers with microfibers, scientists have an opportunity to refine their findings and produce more accurate results. Advances in microfiber production will make the most immediate impact on projects where optical fibers are already in use, such as those performed by ocean researchers. Smaller fibers with greater sensitivity could give scientists additional clues about ocean life and collect details about organisms living in low-light conditions. Microfibers would enhance environmental research as well, helping scientists better anticipate the subtle natural phenomena that act as precursors to events like hurricanes and earthquakes.

One example of such a research project is currently underway at the CNRS - a collaboration with the University of Auckland and the Tampere University of Technology. The team is using optical fibers to locate rogue waves before they have a chance to gain strength and begin capsizing ships. Light and ocean waves share several similarities in their behavior, and these parallels make optical fibers ideal for analyzing how rogue waves form. Shared behaviors between light and water also make it easier to model ocean waves in the lab, or to test how a ship might hold up to bombardment by an enormous deluge.

Analyzing icebergs
Optical fibers are as useful for analyzing rogue waves as they are for collecting data from massive icebergs. As a result, multiple spools of fiber-optic cable have been rolled out across Antarctica, according to Science Magazine. Scientists brought fiber optics to the South Pole in order to collect detailed information on the ice's temperatures. Data is gathered by drilling a hole more than 200 meters deep into the ice and feeding a fiber-optic cable into the frigid ocean water below. The light undergoes minuscule changes as it passes through increasingly cold water, and measuring these small adjustments in light behavior after shining a light through the cord gives researchers more clues about the exact nature of the briny deep.

Integration of the CNRS research with new fiber-optic cable is also ideal in terms of cost-effectiveness. Drilling 200 meters below the surface of the ice is extremely difficult and expensive, so scientists can save time and energy by drilling very thin holes, into which extremely thin fiber-optic cables can be inserted with ease. The use of optical microfibers to study hard-to-reach regions will likely increase as the technique gains notice and the cost of fiber optics comes down. For instance, one research trick unique to optical fibers is distributed temperature sensing, according to Science Magazine. Scientists can learn quite a bit about the ocean's climate, and the shape of its ice shelves, by paying attention to how temperatures change along a length of fiber as it sinks deeper into the ocean.

Advances in biocomputers could lead to the end of cancer

October 27, 2014 

The interface between man and machine has always been a bit strained. However, these two entities, once thought to be entirely distinct, are now growing closer thanks to innovations in biotech. The largest challenge in interfacing human beings with machines has been the fact that computers are made from non-organic parts, which are difficult to embed within a person's body. In the last century, scientists were able to create titanium medical implants with a high degree of biocompatibility, which made them less likely to be rejected by patients' bodies. However, the holy grail of implants has been to create them out of purely biological materials. Recently, researchers from the Technion - Israel Institute of Technology and ETH Zurich have brought science closer to this goal by creating the building blocks for entirely biological computers.

Prototypes and proofs of concept
About a year ago, scientists from the Technion developed and constructed an advanced biological transducer using nothing but biomolecules such as DNA and enzymes. The computing machine is able to manipulate genetic code and use the output as new input for further computations. Importantly, no special interface is required when using this computer, since all of the components - hardware, software, input and output - are constructed of biological materials, which use chemicals instead of binary code to relay information.

Ehud Keinan, Ph.D., the lead researcher and a faculty member in Technion's chemistry department, said, "In addition to enhanced computation power, this DNA-based transducer offers multiple benefits, including the ability to read and transform genetic information, miniaturization to the molecular scale, and the aptitude to produce computational results that interact directly with living organisms."

The press release indicated that the transducer could be used on genetic material to detect and quantify specific sequences and to change and algorithmically process genetic code. This is because, as Keinan explained, all biological systems including living organisms are essentially molecular computers. Inside these systems, molecules communicate with one another in a logical manner, using chemical signals to talk instead of computer languages.

Refining the device
The computing device that Keinan and his team produced last year is operational, but not as precise or as versatile as they would like it to be. Recently, a group of researchers from ETH Zurich has been working on more precise, plug-and-play circuit components that can be assembled in different configurations for various applications. The goal is to create modular biological components that can be combined and reused inside the human body much as software engineers reuse code modules in the digital sector.

According to ETH Zurich, the prototype biological computer created at Technion differs significantly from its silicon-based counterparts. For example, silicon chips are much more agile than biological computers. Since they communicate in ones and zeroes with electric current, silicon chips can oscillate very quickly between signal (one) and no signal (zero). Biological computers, however, are much less precise. The signals they send are based on chemicals, not current, so there are states of sending a signal and not sending a signal, but there are also intermediary states of sending what ETH called "a little bit of signal." This is a problem for biocomputers that serve as sensors for specific biomolecules: when they try to send a relevant signal, the meaning can be obscured or lost in the noise of those intermediary states.

ETH reported that early biocomputer components have also suffered from being "leaky." This means that they will occasionally send an output signal even if an input signal is not present. This problem is amplified further by the addition of extra components, which is necessary if scientists are to be able to create more complex biological machines that can accomplish specific tasks.

Genetic valves control leaking 
In order to create a biosensor that does not leak and is closer to the precision of a silicon chip, ETH scientists have been working on a new biological circuit that monitors and controls the activity of individual sensors with an internal timer. The circuit is able to stop sensors from self-activating when the system does not require it. Then, when the system requires that specific component, the circuit can activate it using a control signal.

ETH explained that this is possible because these biological sensors are made of synthetic genes that are read by enzymes and converted into RNA and proteins. Nicolas Lapique, a doctoral candidate at ETH, designed the new biocircuit so that the gene responsible for the output signal is not active in its resting state. This is accomplished by effectively misaligning the gene, so there is no output. To activate the gene, a special enzyme called a recombinase is introduced. The recombinase extracts the gene from the circuit DNA and repositions it so that its output is aligned with the rest of the system, thus activating it.

To put this in simpler terms, you can imagine the output gene as a garden hose attached to a bucket, which represents the biological computer. When a chemical signal is produced, it's like pouring water into the bucket: the water then flows out of the hose, or output gene. However, once most of the water is spent, the hose will continue to drip, sending those intermediary not-on, not-off states of chemical information. Lapique's circuit acts as a valve that misaligns the output so no residual water can drip out of the bucket. This essentially eliminates the intermediary states by starting and stopping output genes through alignment and misalignment.
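To make the valve idea concrete, here is a deliberately simple toy model of the behavior described above. The class and values are invented for illustration; it captures only the logic - no output while the gene is misaligned, faithful output once a recombinase flips it - and does not simulate any real genetic circuit.

```python
class OutputGene:
    """Toy model of the 'valve' idea: the output gene sits in the circuit
    misaligned (no expression) until a recombinase flips it into place.
    Illustration of the logic only, not a biological simulation."""

    def __init__(self):
        self.aligned = False  # resting state: gene inverted, so no output at all

    def apply_recombinase(self):
        self.aligned = True   # recombinase excises and reorients the gene

    def respond(self, input_signal: float) -> float:
        if not self.aligned:
            return 0.0        # even 'a little bit of signal' produces nothing
        return input_signal   # aligned gene: output tracks the chemical input

gene = OutputGene()
print(gene.respond(0.2))      # 0.0 -- leaky intermediate input is suppressed
gene.apply_recombinase()
print(gene.respond(0.2))      # 0.2 -- the sensor is now activation-ready
```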

Future applications
Already, researchers have tested the effectiveness of the new circuit and activation-ready sensor in samples of human kidney and cancer cells. These early stages have proven the concept, but results are still preliminary. Eventually, researchers hope to use these biological computer systems to detect and kill cancer cells. The theory is that the biocomputers will be able to identify specific cancer molecules; if cancer is detected within a specific cell, the circuit would activate a cellular suicide program inside the affected cell. The cancerous cell would then self-destruct, while healthy cells with no cancer markers would not be affected.

Although there is still much work to be done on biocomputers, research like that conducted by Technion and ETH has demonstrated valuable proof of concept, and additional study will surely lead to amazing results in the coming years.