Kevin Lepton

I am the writer, editor and publisher behind this future technology blog and I predict you will keep reading to see what is coming right around that metaphorical corner.

Apr 17, 2015

On April 9, 2015, the United States Department of Energy announced an award intended to advance U.S. leadership on the road to exascale computing. Under the Collaboration of Oak Ridge, Argonne, and Lawrence Livermore (CORAL) initiative, the DOE will invest $200 million to develop a new supercomputer and install it in the Argonne Leadership Computing Facility at Argonne National Laboratory.

The next-generation machine, called “Aurora”, will be developed under a contract led by world-renowned chip maker Intel, with supercomputing giant Cray, known for its lucrative government and commercial contracts in recent years, building the system as subcontractor.

Aurora is remarkable not just for the amount of money being put into it, but also for its impressive specifications and breathtaking potential. As one of the most powerful pre-exascale supercomputers ever planned, it is expected to reach a peak performance of 180 petaflops, which would make it one of the fastest and most powerful computing machines ever made.

If this doesn’t seem too impressive, consider that a typical modern computer, depending on its hardware, can achieve up to around 2,600 gigaflops. Some of the world’s current supercomputers, namely the National Nuclear Security Administration’s Sequoia machine at Lawrence Livermore and the Titan at Oak Ridge National Laboratory, have peak performances of 20 petaflops and 27 petaflops, respectively.
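To put those figures on one scale, here is a quick back-of-the-envelope comparison in Python. The numbers are the ones quoted above; the desktop figure is a rough assumption for a well-equipped 2015 PC, and all values are theoretical peaks rather than sustained performance:

```python
# Peak-performance comparison using the figures quoted in this post.
# The desktop value is a rough assumption, not a measured benchmark.
GIGA = 10**9
PETA = 10**15

machines = {
    "typical desktop PC": 2_600 * GIGA,  # ~2,600 gigaflops (assumed)
    "Sequoia (LLNL)":     20 * PETA,     # ~20 petaflops peak
    "Titan (Oak Ridge)":  27 * PETA,     # ~27 petaflops peak
    "Aurora (projected)": 180 * PETA,    # ~180 petaflops peak
}

desktop = machines["typical desktop PC"]
for name, flops in machines.items():
    print(f"{name:20} {flops:.1e} FLOPS (~{flops / desktop:,.0f}x a desktop)")
```

By that rough measure, Aurora’s projected peak works out to something like 69,000 desktops’ worth of raw arithmetic.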

Aurora will be built around Intel’s third-generation Xeon Phi processor, code-named Knights Hill. The chip is still in development and Intel has said little about it, but the system is expected to deliver breakthrough performance, run a massive range of applications, and prove more power-efficient than today’s supercomputers. It is also designed to be highly scalable and adaptable, which could pave the way to new scientific discoveries with global impact.
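Intel has not published Knights Hill’s specifications, but headline figures like “180 petaflops” come from a standard formula: theoretical peak = nodes × cores per node × floating-point operations per core per cycle × clock rate. Here is a minimal sketch of that calculation; every number in it is a hypothetical placeholder, not a confirmed Aurora or Knights Hill spec:

```python
# Classic back-of-the-envelope estimate of a machine's theoretical peak.
# All inputs below are hypothetical placeholders, NOT confirmed Aurora specs.
def peak_flops(nodes: int, cores_per_node: int,
               flops_per_core_per_cycle: int, clock_hz: float) -> float:
    """Theoretical peak = nodes * cores * FLOPs/core/cycle * clock."""
    return nodes * cores_per_node * flops_per_core_per_cycle * clock_hz

# Example: 50,000 nodes, 64 cores each, 32 double-precision FLOPs per
# core per cycle (wide vector units plus fused multiply-add), at 1.5 GHz.
estimate = peak_flops(nodes=50_000, cores_per_node=64,
                      flops_per_core_per_cycle=32, clock_hz=1.5e9)
print(f"{estimate / 1e15:.0f} petaflops")  # ~154 petaflops, Aurora-ish scale
```

Real machines sustain only a fraction of this theoretical peak on actual workloads, which is why peak figures should be read as upper bounds.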

The Aurora contract funds the third and final pre-exascale system under the CORAL initiative. Earlier, the Department of Energy announced that it is investing around $325 million to develop state-of-the-art supercomputers for two other laboratories. Oak Ridge National Laboratory is set to receive Summit, with a theoretical peak of 150 to 300 petaflops, in 2017. In the same year, Lawrence Livermore National Laboratory will get its own machine, named Sierra, with a peak performance of 100 petaflops.

Aurora is scheduled to be installed at the Argonne Leadership Computing Facility in 2018. Before that, Intel and Argonne will collaborate on an interim system, named Theta, to be delivered in 2016. Theta will help the ALCF community transition their programs and applications to the new technology and ensure they retain all important data when Aurora is rolled out.

Argonne National Laboratory’s supercomputer will mainly be used to boost the performance of computing applications that are valuable to the Department of Energy and other agencies. It will also be open to the wider scientific community, helping attract the country’s best researchers to Argonne and advancing fields such as materials science, biological science, renewable energy and transportation efficiency.

Mar 26, 2015

Graphene

Scientists have long known that graphene exists. After all, anyone who has ever drawn with a pencil has produced traces of the substance in question. Basically, it is a crystal of carbon just one atom thick, a million times thinner than a human hair and 200 times stronger than steel. The problem was that no one knew how to isolate it from graphite.

This is where two Russian-born scientists come in. Andre Geim and Konstantin Novoselov are researchers at the University of Manchester, and in one of their Friday night experiments, sessions held outside their regular work to keep up their interest in the field and generate new ideas, they accidentally isolated graphene with the help of Scotch tape.

The pair wrote a three-page paper describing what they had just discovered. It was rejected by Nature – twice – but eventually got published in the journal Science in 2004.

Since then, researchers all over the world have devoted time to studying this fantastic material, which is as pliable as rubber and can stretch to 120% of its original length. They have also found that it is an excellent conductor of heat and electricity.

Six years after Geim and Novoselov published their paper, they were awarded the 2010 Nobel Prize in Physics. The material they had isolated was lauded as “a wonder material” and one that “could change the world.” Researchers from various fields, including medicine, chemistry, physics and electrical engineering, have come together to study it.

As a result, the number of graphene-related patents has risen. The UK Intellectual Property Office alone reports a jump from 3,018 in 2011 to 8,416 at the beginning of 2013. Samsung and Sungkyunkwan University in Korea, Zhejiang University in China and IBM in the US are the leaders in patent applications.

The possibilities are endless when it comes to products that could be developed using graphene: bendable computer screens, long-life batteries, very fast microcomputers and more. The catch was that making the substance took a long time and required very high temperatures. This is the problem that Caltech staff scientist David Boyd was able to address with yet another accidental discovery.

Boyd wasn’t having any luck creating graphene by exposing methane to a heated copper surface. One day, a phone call distracted him and he left the copper on the heat longer than usual. When he got back, he found that graphene had formed: the extended heating had removed a key impurity.

Basically, what used to take about 10 hours at a very high temperature can now be accomplished in around five minutes at a lower temperature.
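Using the approximate times quoted above, that works out to a speedup of roughly two orders of magnitude:

```python
# Rough speedup from the approximate times quoted above.
old_minutes = 10 * 60  # ~10 hours for the old process
new_minutes = 5        # ~5 minutes for Boyd's process
print(f"~{old_minutes / new_minutes:.0f}x faster")  # -> ~120x faster
```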

The discovery opens up a world of possibilities for graphene-based products. As Boyd told the Pasadena Star-News, “You could imagine something crazy. You could wrap a building in graphene to keep it from falling over.”

External Resource

http://www.huffingtonpost.com/2015/03/19/better-graphene-making-process-breakthrough_n_6891226.html

Feb 23, 2015

Biohacking DNA

Biohacking is the term for applying hacker principles to your own biological processes. If you think of biology as a computer, and of the way a hacker infiltrates a system to make it work the way they want, biohacking becomes easier to understand. It can involve a combination of medical, nutritional and electronic methods to make the body function exactly as you want. Typically, those who engage in biohacking support the theory of “transhumanism”, which holds that technology can fundamentally alter the human condition and lead to a more advanced kind of human.

Benefits of Biohacking

For those who struggle with mental health conditions, research suggests that biohacking could have a great impact on treatment. One goal of many biohackers is to boost serotonin and dopamine, the neurotransmitters commonly associated with good feelings. Biohacking is said to increase long-term memory and productivity and to have positive effects on both mind and body.

In scientific terms, biohacking advances the understanding of the body and its processes with very little input from the medical community. Many biohackers feel that they can break free from the constraints of traditional science and medicine and make great progress toward taking charge of their own biology.

Many biohackers focus on cognitive health, the balance of neurotransmitters in the brain, a positive and productive mindset, and quick results. In theory, any progress toward improving the human mind and body can only benefit the population as a whole.

Disadvantages of Biohacking

The biggest issue many take with biohacking is the ethics involved. Because the field is unregulated, many wonder whether biohackers could create carcinogenic or pathologically harmful organisms, intentionally or unintentionally. The field has spawned a new discussion of biosecurity, which seeks to identify and manage the risks biohacking poses to society. In 2011, DIYbio drew up a code of ethics that biohackers are expected to adhere to, but no binding rule requires them to do so.

Additionally, biohacking is a relatively new practice, and while biohackers are quick to claim positive long-term benefits, the durability of their methods and techniques has yet to be truly tested. Biohacking is also largely a free-for-all in terms of who can become a biohacker and what they can do with it.

Why It’s Important

Biohacking could change how we deal with our bodies in the future. It is an important new field that needs to be studied and to receive close attention from the public. In a regulated environment, biohacking could give humans a new perspective on their biology and make tremendous advances in treating conditions such as mental illness and addiction. However, it is not something to be taken lightly, because the risks are real. Here’s to seeing what biohacking has to bring in the near future!

References – External Links

http://en.wikipedia.org/wiki/Biohacking

http://diy-bio.com/