Kevin Lepton

I am the writer, editor, and publisher behind this future-technology blog, and I predict you will keep reading to see what is coming right around that metaphorical corner.

Oct 06, 2015

Children can now enjoy fun activities like solving puzzles and coloring their favorite Disney characters with the Disney Color and Play app. The people behind the Disney Research Project are taking the art of coloring to another level by developing an app that allows users, young and old, to color using augmented reality (AR).

Coloring books are not only creative tools for developing art skills in children but also great stress busters for adults. And with this augmented reality coloring book app from Disney, children of all ages are introduced to a whole new world of art and technological advancement.

Basically, the user simply holds the tablet running the app over the coloring book image he or she is working on, and the result is a three-dimensional model of that image. What sets it apart from Crayola Color Alive, an iOS mobile app launched in 2014, is that it lets you see the 3D image while you are still coloring. The Crayola app, on the other hand, only shows the output after the work is done.

Another twist worth looking forward to is the textured 3D version of the image being colored. That is, whatever texture appears on the coloring book page, the app reproduces on screen. Moreover, the object is animated and can be viewed from different angles, allowing the user to watch the 2D image transform in real time.

With the added dimension in the on-screen output, you can get an idea of how the image looks from different perspectives. This is made possible by what is known as a virtual spring system.
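The post does not explain how the virtual spring system works, so the snippet below is only a rough, assumed illustration of the general idea in Python: tracked points on the page are pulled toward their rest positions by damped springs, which keeps a reconstructed surface from jittering from frame to frame. The function names, constants, and data here are all hypothetical.

```python
# Toy illustration (not Disney's method): damped springs settle noisy tracked
# points back toward their idealized rest positions on the page.
import numpy as np

def spring_step(points, rest_points, velocities, k=8.0, damping=0.8, dt=1 / 30):
    """One explicit integration step of damped springs toward rest positions."""
    force = k * (rest_points - points)           # Hooke's law pull toward rest
    velocities = damping * (velocities + force * dt)
    return points + velocities * dt, velocities

rest = np.random.rand(50, 3)                     # idealized (flat) page points
pts = rest + 0.05 * np.random.randn(50, 3)       # noisy tracked positions
vel = np.zeros_like(pts)

for _ in range(60):                              # about two seconds at 30 fps
    pts, vel = spring_step(pts, rest, vel)

print(np.abs(pts - rest).max())                  # points have settled near rest
```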


How does it work?

With the application of augmented reality in this latest project from Disney’s creative team, the app being developed is able to detect and track the original 2D image on paper and convert it into a three-dimensional image in real time. As you color the image, you can watch your work develop on your tablet, and the animated 3D image on the screen makes it feel like magic.
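To make the idea concrete, here is a minimal Python sketch of the core per-frame step, not Disney's actual pipeline: sample the colors the user has drawn on the flat page and re-apply them to the matching 3D model through a fixed UV mapping. The function, array shapes, and data below are placeholders chosen purely for illustration.

```python
# Minimal sketch: every camera frame, look up per-vertex colors from the
# rectified view of the coloring page and push them onto the 3D model.
import numpy as np

def sample_page_colors(page_image, uv_coords):
    """Return one color per mesh vertex, sampled from the flat page image.

    page_image : (H, W, 3) array, the rectified camera view of the page.
    uv_coords  : (N, 2) array of per-vertex texture coordinates in [0, 1].
    """
    h, w, _ = page_image.shape
    cols = np.clip((uv_coords[:, 0] * (w - 1)).astype(int), 0, w - 1)
    rows = np.clip((uv_coords[:, 1] * (h - 1)).astype(int), 0, h - 1)
    return page_image[rows, cols]               # (N, 3) per-vertex colors

# Fake data standing in for a tracked camera frame and a small mesh.
frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
uvs = np.random.rand(100, 2)                    # 100 vertices with UV coordinates

vertex_colors = sample_page_colors(frame, uvs)
print(vertex_colors.shape)                      # (100, 3), refreshed every frame
```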

Augmented reality has been around for a while now and has been used in many different applications. It has enhanced the user experience by integrating contextual information with what is seen on a camera feed, and it has made gaming more realistic by allowing images to come alive.

This technology has been used in business applications as well. However, Disney came up with something that stands out among the creations introduced by other companies: an app that does not take away the joy of using traditional coloring books but rather enhances the experience with technology.

With the number of activities that can be enjoyed on smart devices and gadgets, researchers from Disney say that traditional coloring books might not be as exciting as they used to be. But by turning 2D coloring book images into animated 3D models, that may no longer be a concern.




Sep 15, 2015

Unlike regular printing, which almost all of us know how to do without having to scratch our eyes out, 3D design and printing is in a league of its own. Basically, it’s tailored towards those who are tech savvy and actually love creating three-dimensional objects. However, Madeline Gannon, a researcher and teacher at the Carnegie Mellon University School of Architecture and a PhD candidate in Computational Design, wants to change that. She wants to unleash the designer that is hiding in all of us.

Gone are the days when 3D printers used to be luxury machines. Although there are still rather expensive models these days, technology has advanced so much that anyone with a few hundred dollars to spare can get their hands on a 3D printer. However, as Gannon notes, not everyone can just create original 3D objects.


Enter Tactum

In order to give creative power to ordinary 3D printer owners, Gannon developed a system called Tactum. It’s an innovative software system that gives users the ability to create their own designs for 3D printers by just touching a projected image.

Essentially, one just has to rub, poke, or use other hand gestures on a projected image, which then becomes the 3D printed object. Through this process, people can instantly see the object change shape in response to their touches.
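As a hypothetical illustration of that interaction, and not Tactum's actual implementation, the Python sketch below treats a "poke" as a displacement applied to nearby mesh vertices, with the effect falling off smoothly with distance from the touch point. All names and numbers are invented for the example.

```python
# Toy gesture-driven deformation: a poke pushes nearby vertices inward,
# with a Gaussian falloff so the surface bends smoothly around the touch.
import numpy as np

def poke(vertices, touch_point, direction, strength=0.02, radius=0.05):
    """Displace vertices near touch_point along direction, Gaussian falloff."""
    dist = np.linalg.norm(vertices - touch_point, axis=1)
    falloff = np.exp(-(dist / radius) ** 2)     # 1 at the touch, ~0 far away
    return vertices + strength * falloff[:, None] * direction

# A flat ring of vertices standing in for a bracelet band around the wrist.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
band = np.stack([np.cos(theta), np.sin(theta), np.zeros_like(theta)], axis=1)

touched = poke(band,
               touch_point=np.array([1.0, 0.0, 0.0]),
               direction=np.array([-1.0, 0.0, 0.0]))
print(touched[:3])   # vertices nearest the touch have moved inward
```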

The first series of 3D objects Gannon designed made use of a surface that is very much accessible: the human body.

Together with a companion project called Reverb which helps convert user-created designs into printable meshes, Gannon has created bracelets and necklaces with wide-ranging designs, including smooth landscapes and intricate textures.


Further Uses

Tactum’s use in creating fashionable items is just the beginning. The system really proves itself when used in the creation of functional pieces like the custom watchband Gannon designed for a Motorola Moto 360 smartwatch.

Gannon plans to use Tactum for customizing prosthetics and other wearable medical devices. Can you imagine how much better it would be for patients to have a truly customized device? They could collaborate with doctors and a Tactum technician in real time, providing feedback on the fit and feel of the device.

Tactum has the potential to produce 3D objects much more quickly and at an even lower cost. As such, doctors and patients can continually adjust a prosthetic limb as they see fit, while also giving the patient a certain degree of personal expression.


The Journey to Maker

Gannon’s path to becoming a maker can be traced back to her trips to museums as a high school student. The buildings interested her more than the exhibits themselves, and she realized then that she wanted to do something involving architecture.

However, during her last year of architecture school, she experienced the limits of the human-computer interface; in other words, the computer couldn’t produce the ideas in her head. As a result, Gannon plunged into computer science.

Fast forward to the present, and Gannon now heads MADLAB.CC, a design collective that explores computational approaches to architecture, craft, and interaction, all aimed at exploring the “edges of digital creativity.”


Aug 05, 2015

Advancements in cognitive neuroscience have increasingly engaged a wider audience. While simple paradigms and traditional experiments conducted in controlled laboratory environments have contributed significantly to our insights into brain function, a fuller understanding of cognition requires similarly complex, realistic environments. Drawing on an electroencephalography (EEG) based brain-computer interface (BCI) experiment approved by the Research Ethics Board at Baycrest and the University of Toronto, let us dig deeper into how neuroscience and art can affect the brain.


The Experiment

The experiment focused on exploring a person’s ability to rapidly learn to control his or her brain states in a complex environment. This objective, along with criteria related to an art exhibition, guided the entire experimental design. Participants received only several minutes of controlled neurofeedback, a period much shorter than in typical neurofeedback training experiments. The hypothesis was that neurofeedback effects can be detected early in training and that a large sample size would provide sufficient statistical power to reveal them.

The study noted that the aesthetic sophistication and technological maturity of virtual reality, gaming, and multimedia have positioned these platforms as suitable partners for neuroscience. It also observed that EEG has expanded beyond the lab through BCI technology and therapeutic neurofeedback interventions, as well as through other products such as wearable devices for self-optimization, self-monitoring, and neurogaming. This means that neurofeedback protocols based on brain-computer interfaces hold real promise for attention, learning, and creativity.

The study also found that learning in BCI applications is enhanced when a person learns to modulate his or her brain activity in as little time as possible. Learning is associated with structural and functional changes in the brain, and although re-organization is continuous at the synaptic scale, large-scale effects take time to manifest. Sensory stimulation protocols have also yielded persistent re-organization of coupling between distributed brain areas after stimulation. With regard to cognitive performance, individual neurofeedback training sessions were found to mediate significant changes.


Findings in Detail

The researchers found interesting global patterns of correlation between brain data and demographic variables, regardless of condition. This was reached by folding together all condition-specific relative spectral power (RSP) measurements, pooling each participant’s data across all conditions. The effects of the different headsets were treated as nuisance variables.

Neurofeedback also had significant effects on relaxation and concentration, depending on the subjects’ conditions. Participants were found to learn to modulate their relative spectral power for relaxation and concentration. Based on these general results, it was hypothesized that early (yet subtle) changes in brain activity are associated with the short neurofeedback training protocol and would be detectable with a larger sample size.
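As an aside, relative spectral power itself is straightforward to compute. The Python sketch below shows one common way to do it with Welch's method; the sampling rate, frequency bands, and toy signal are chosen purely for illustration and are not taken from the study.

```python
# Illustrative RSP computation: fraction of EEG power in a chosen band
# (e.g., alpha, 8-12 Hz) relative to the total power of interest.
import numpy as np
from scipy.signal import welch

def relative_band_power(signal, fs, band, total=(1.0, 40.0)):
    """Fraction of EEG power falling inside `band` (Hz), via Welch's PSD."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    in_band = (freqs >= band[0]) & (freqs < band[1])
    in_total = (freqs >= total[0]) & (freqs < total[1])
    return np.trapz(psd[in_band], freqs[in_band]) / np.trapz(psd[in_total], freqs[in_total])

fs = 256                                       # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # alpha-heavy toy signal

print(relative_band_power(eeg, fs, band=(8.0, 12.0)))  # high alpha RSP
```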



Both the novel and the confirmatory findings from the experiment provide a necessary proof of concept for a new neuroscience research framework. By combining brain-computer interfaces, art, and performance, we can now ask questions about complex, real-life social cognition that are not otherwise accessible in laboratory settings. The study concludes that the traditional approach to studying the mind discounts a central feature of the brain: that it is intrinsically subjective. This opens interesting new avenues for neuroscience research that account for the sociability, complexity, and individuality of the human mind.

