For app developers, more big changes are coming soon

The App Store revolutionized the tech world when it opened in summer 2008, spawning a billion-dollar industry in one fell swoop. It was neither the first nor the largest back then, but the store quickly exploded in popularity, prompting Apple co-founder Steve Jobs to say “it is going to be very hard for others to catch up.”

The store was one of the big stars this week at Apple’s annual Worldwide Developers Conference in San Francisco, with CEO Tim Cook’s announcement that it had recently “passed a major milestone, with 100 billion app downloads” since the store opened its virtual doors.
Many in the app-developer fold say the business, thanks to the marketplace created by the App Store and other outlets like Google Play, is still in its infancy and mobile apps in the next few years will continue to change human behavior in unimaginable ways.
From the “Internet of Things,” where apps will help connect an estimated 50 billion devices to the Internet by 2020 and transform the way we relate to our homes and workplaces, to the continued democratization of software, where tech novices will be able to build their own apps, the digital landscape will shift at breakneck speed.
At the same time, the way apps come into being could also go through a seismic shift. The small independent app-makers who early on helped make the App Store the success it is today will find it harder to survive there, while large corporations will dominate the stage as their in-house coders custom-tailor more and more apps to meet their customers’ needs.
“We’re seeing big companies taking over” the mobile-app industry, said Mark Wilcox, a business analyst with VisionMobile. As a result, according to the firm’s recent survey, nearly half of the developers who want to make money building apps actually make zero or next to nothing. “Large companies, and especially game publishers, take all the top spots on the App Store and most of the revenue,” he said. “The little guys are struggling to compete.”
Calling the momentum “absolutely staggering,” Cook told developers this week that the App Store has forever changed the way we think of software and the way we all increasingly use it in our daily lives.
Connecting everyday objects, from home-heating systems to toasters, will continue to be a major focus for developers, with one survey showing that 53 percent of respondents said they were already working on so-called IoT – or “Internet of Things” – apps. Wearable tech, like the new Apple Watch, could host thousands of new apps this year alone, from health and fitness monitors to tools not yet envisioned.
In a clear nod to the future of apps already unfolding on wearable technology, Cook used his keynote address to introduce Kevin Lynch, Apple’s vice president for technology, to talk about watchOS 2, the first major update for the Apple Watch since it was unveiled last September. Lynch said developers could soon use the new software to build native, or in-watch, apps that would allow users to tap directly into the watch’s burgeoning bounty without having to rely on their iPhones for access.
Another budding trend features strategically placed beacons, small devices in the physical world that interact with apps, which in turn will collect and process mountains of data. An in-app sale offer triggered on your phone by a beacon inside the Wal-Mart you just entered is an example of this technology. Over time, all that data collected from our phones about our daily patterns will then guide and improve the software we’ll use to work and play.
“The ‘Internet of Things’ is happening quickly,” said 22-year-old Ashu Desai, whose Make School is teaching college and high-school students how to build apps. “We’ll see apps where your phone will know more and more about your surroundings. There will be a massive proliferation of sensors that will be everywhere so apps can send you the temperature of your hot tub, lock and unlock your doors, and turn on your stove remotely.”
In a way, the future of apps is already here, with an increasing number of them not on public view at the App Store but quietly being harnessed by teams within private companies and organizations, from giants like Salesforce to stage crews at musical venues to small enterprises like contractors and electricians.
Consultant Richard Carlton helps companies use programs like Apple-owned FileMaker to create their own proprietary apps that allow colleagues to collaborate on a shared database they can all access from their mobile devices.
“These mobile tools allow people who aren’t coders to build their own solutions and share them with their fellow employees,” he said. “For example, we’ve helped plumbers create apps they can use to update their work schedules on their phones. This software lets you sign contracts in the field, take photos and enter them in a database, or do property inspections. And this costs the company a quarter of what they’d pay to have a professional app developer do it.”
In other trends in the coming year, there will be more video ads playing on our smartphone screens, more crowdfunding to launch app startups and more developers leaving Google and Apple to become consultants who’ll build apps for corporate clients like The Home Depot.
“Every company out there is turning to mobile, whether it’s retail or airlines or real estate,” said Shravan Goli, president of the tech-jobs site Dice.com. With big companies storming into the market, the coming years will be tough for the independents, said Craig Hockenberry with app-design firm The Icon Factory.
“The people who want to survive solely off the puzzle game or the camera app are the ones having a problem right now,” he said. “When the App Store opened, our first app sold well because there wasn’t a lot of competition. We were a big fish in a small pond. Now the pond is more like an ocean.”
----
What’s the future of apps?
We asked five attendees at Apple’s annual Worldwide Developers Conference this week in San Francisco for their take on what’s ahead.
Jenna Hoffstein
Educational app developer, Boston
“We’ll see a broader use of apps in schools, supporting teachers and giving kids more engaging ways to learn math and science.”
Ashok Ramamoorthy
Product manager, India
“All your business will develop around your (enterprise) app. If you’re not taking advantage of that, you’re losing money.”
Igor Ievsiukov
Developer, Ukraine
“Apps will be smarter and they’ll distract the user less. Their functions will be more personalized and more precisely tailored.”
Amy Wardrop
Digital product manager, Sydney
“The future of apps is all about the experiential, the actual experience of being human. Wearable health and fitness devices, for example, will provide personal analytics, with more layering of information from both humans and their environment.”
Ashish Singh
Developer, India
“Apps will become part of every aspect of our lives, with virtual-reality apps more prevalent. With an app and a pair of VR glasses, you’ll be able to virtually tour a property for sale, museums or vacation destinations.”

References: http://phys.org/

Water Droplet-Powered Computers Could Run Mini Science Labs

A computer made using water and magnets can move droplets around inside itself like clockwork, researchers say. The device demonstrates a new way to merge computer calculations with the manipulation of matter, scientists added.

Whereas conventional microelectronics shuffle electrons around wires, in recent years, scientists have begun developing so-called microfluidic devices that shuffle liquids around pipes. These devices can theoretically perform any operation a conventional electronic microchip can.

Although microfluidic devices are dramatically slower than conventional electronics, the goal is not to compete with electronic computers on traditional computing tasks such as word processing. Rather, the aim is to develop a completely new class of computers to precisely control matter.

“The fundamental limits of computation, such as how fast you can go or how small devices can be, are based in how information has to be represented in physical entities,” study co-author Manu Prakash, a biophysicist at Stanford University, told Live Science. “We flipped that idea on its head — why can’t we use computations to manipulate physical entities?”

Current applications for microfluidic chips include serving as miniaturized chemistry and biology laboratories. Instead of performing experiments with dozens of test tubes, a lab-on-a-chip lets each droplet serve as a microscopic test tube, enabling scientists to conduct thousands of experiments simultaneously while requiring a fraction of the time, space, materials, cost and effort of a conventional laboratory.

But one major drawback of microfluidic devices is that the droplets of liquid are usually controlled one at a time. Although Prakash and his colleagues previously demonstrated a way to control many droplets on a microfluidic chip simultaneously, until now the actions of such droplets were not synchronized with one another. That made these systems prone to errors, which prevented the devices from taking on more complex operations.

Now Prakash and his colleagues have developed a way for droplets on microfluidic devices to act simultaneously, in a synchronized manner. The key was using a rotating magnetic field, like a clock.

The core of the new microfluidic chip, which is about half the size of a postage stamp, consists of tiny, soft, magnetic nickel-iron-alloy bars arranged into mazelike patterns. On top of this array of bars is a layer of silicone oil sandwiched between two layers of Teflon. The bars, oil and Teflon layers are in turn placed between two glass slides.

The researchers then carefully injected water droplets into the oil; these droplets were infused with tiny magnetic particles only nanometers, or billionths of a meter, wide. Next, the researchers turned on a rotating magnetic field.

Each time the magnetic field reversed, the bars flipped, drawing the magnetized droplets along specific directions, the researchers said. Each rotation of the magnetic field was very much like a cycle on a clock — for instance, a second hand making a full circle on a clock face. The rotating magnetic field ensured that every droplet ratcheted precisely one step forward with each cycle, moving in perfect synchrony.

A camera recorded the movements and interactions of all the droplets. The presence of a droplet in any given space represents a one in computer data, while the absence of a drop represents a zero; interactions among the droplets are analogous to computations, the researchers said. The layout of the bars on these new microfluidic chips is analogous to the layout of circuits on microchips, controlling interactions among the droplets.
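
The clocked, synchronous behavior described above lends itself to a short toy simulation. The Python sketch below is only a loose analogy, not the actual chip's physics or layout: it treats each rotation of the field as a clock tick that advances every "droplet" one step along its track, with presence or absence at a readout slot standing in for a 1 or a 0.

```python
# Toy model of clocked droplet logic (illustrative only, not the real chip design).

def clock_cycle(tracks):
    """Advance every droplet on every track by one position, all in synchrony."""
    return [[pos + 1 for pos in track] for track in tracks]

def read_bit(track, readout_position):
    """A droplet sitting at the readout position is a 1; an empty slot is a 0."""
    return 1 if readout_position in track else 0

# Two input tracks: track A carries a droplet (bit 1), track B is empty (bit 0).
tracks = [[0], []]

for _ in range(5):              # five rotations of the external magnetic field
    tracks = clock_cycle(tracks)

print([read_bit(t, 5) for t in tracks])   # -> [1, 0]
```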

So far, the droplets in this device are as little as 100 microns wide, about the average width of a human hair. The researchers noted that their models suggest the devices could ultimately control droplets just 10 microns across. “Making the droplets smaller will allow the chip to carry out more operations,” Prakash said.

The researchers now plan to make a design tool for these droplet circuits available to the public, so that anyone can make them.

“We’re very interested in engaging anybody and everybody who wants to play, to enable everyone to design new circuits based on building blocks we describe in this paper, or [to] discover new blocks,” Prakash said in a statement.

Prakash and his colleagues Georgios Katsikis and James Cybulski, both of Stanford University, detailed their findings June 8 in the journal Nature Physics.

References: http://www.livescience.com/

Centimeter-long origami robot climbs inclines, swims, and carries loads

At the recent International Conference on Robotics and Automation, MIT researchers presented a printable origami robot that folds itself up from a flat sheet of plastic when heated and measures about a centimeter from front to back.

Weighing only a third of a gram, the robot can swim, climb an incline, traverse rough terrain, and carry a load twice its weight. Other than the self-folding plastic sheet, the robot’s only component is a permanent magnet affixed to its back. Its motions are controlled by external magnetic fields.
“The entire walking motion is embedded into the mechanics of the robot body,” says Cynthia R. Sung, an MIT graduate student in electrical engineering and computer science and one of the robot’s co-developers. “In previous [origami] robots, they had to design electronics and motors to actuate the body itself.”
Joining Sung on the paper describing the robot are her advisor, Daniela Rus, the Andrew and Erna Viterbi Professor in MIT’s Department of Electrical Engineering and Computer Science; first author Shuhei Miyashita, a postdoc in Rus’ lab; Steven Guitron, who just received his bachelor’s degree in mechanical engineering from MIT; and Marvin Ludersdorfer of the Technical University of Munich.
Fantastic Voyage

The robot’s design was motivated by a hypothetical application in which tiny sheets of material would be injected into the human body, navigate to an intervention site, fold themselves up, and, when they had finished their assigned tasks, dissolve. To that end, the researchers built their prototypes from liquid-soluble materials. One prototype robot dissolved almost entirely in acetone (the permanent magnet remained); another had components that were soluble in water.
“We complete the cycle from birth through life, activity, and the end of life,” Miyashita says. “The circle is closed.”
In all of the researchers’ prototypes, the self-folding sheets had three layers. The middle layer always consisted of polyvinyl chloride, a plastic commonly used in plumbing pipes, which contracts when heated. In the acetone-soluble prototype, the outer layers were polystyrene.
Slits cut into the outer layers by a laser cutter guide the folding process. If two slits on opposite sides of the sheet are of different widths, then when the middle layer contracts, it forces the narrower slit’s edges together, and the sheet bends in the opposite direction. In their experiments, the researchers found that the sheet would begin folding at about 150 degrees Fahrenheit.
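
Read as a rule, the slit geometry determines which way each crease bends once the middle layer starts contracting. The tiny Python sketch below merely restates that rule from the description above; the 150-degree threshold comes from the researchers' experiments, while the function name and the widths passed in are illustrative stand-ins.

```python
FOLD_ONSET_F = 150  # approximate temperature at which folding began in the experiments

def fold_direction(top_slit_width, bottom_slit_width, temperature_f):
    """Which way a crease bends for a pair of opposing slits of given widths."""
    if temperature_f < FOLD_ONSET_F:
        return "flat"                    # middle layer has not contracted yet
    if top_slit_width == bottom_slit_width:
        return "no preferred direction"  # equal slits give no bias
    # The narrower slit's edges are forced together, and the sheet bends away
    # from that side, i.e. toward the side with the wider slit.
    return "bends toward bottom" if top_slit_width < bottom_slit_width else "bends toward top"

print(fold_direction(0.5, 1.5, 170))   # -> "bends toward bottom"
```
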
Once the robot has folded itself up, the proper application of a magnetic field to the permanent magnet on its back causes its body to flex. The friction between the robot’s front feet and the ground is great enough that the front feet stay fixed while the back feet lift. Then, another sequence of magnetic fields causes the robot’s body to twist slightly, which breaks the front feet’s adhesion, and the robot moves forward.

Outside control

In their experiments, the researchers positioned the robot on a rectangular stage with an electromagnet at each of its four corners. They were able to vary the strength of the electromagnets’ fields rapidly enough that the robot could move nearly four body lengths a second.

In addition to the liquid-soluble versions of their robot, the researchers also built a prototype whose outer layers were electrically conductive. Inspired by earlier work from Rus and Miyashita, the researchers envision that a tiny, conductive robot could act as a sensor. Contact with other objects—whether chemical accretions in a mechanical system or microorganisms or cells in the body—would disrupt a current passing through the robot in a characteristic way, and that electrical signal could be relayed to human operators.
“Making small robots is particularly challenging, because you don’t just take off-the-shelf components and bolt them together,” says Hod Lipson, a professor of mechanical and aerospace engineering at Cornell University, who studies robotics. “It’s a challenging angle of robotics, and they’ve been able to solve it.”
“They use digital manufacturing techniques so that the intelligence of the manufacturing is embedded in the material,” Lipson adds. “I think the techniques they describe would scale to smaller and smaller dimensions, so they by no means have reached a limit.”

References: http://phys.org/

A computer algorithm to quantify creativity in art networks

A team of researchers at Rutgers University has taken on the novel task of getting a computer to rate paintings made by the masters, based on their creativity. They have written a paper describing their approach and the results they have obtained in running their algorithm and have posted it on the preprint server arXiv.

The value of art lies in the eye of the beholder: one viewer may find that a particular painting moves them to tears, while another feels nothing at all; such is the intangible nature of the human mind and its reaction to stimuli. Creativity, on the other hand, is a little more easily recognized, whether in art, the sciences or other areas. In this new effort, the team at Rutgers sought to bring some science to the fine art of creativity recognition as it applies to one of the most recognized fine arts: paintings done by masters over the years. Traditionally, labeling a work of art as creative has fallen to art scholars with years of training, background and love of the work. A creative work has to offer something new, of course, but according to the researchers it must also have demonstrated some degree of influence, i.e., have been copied by others who came after. They set out to create an algorithm that, once finished, could rate works by the masters based on nothing but their creativity.
To create that algorithm, the team started with what are known as classemes, in which a computer recognizes an object in a picture and assigns it to a particular category. Next, they drew on WikiArt, a large and easily accessible database that contains, among other things, approximately 62,000 images of famous paintings. Finally, they applied theoretical work from network science to help figure out which paintings were a clear influence on the creation of later ones.
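
As a rough illustration of the underlying novelty-plus-influence idea, and not the Rutgers team's actual network-science formulation, the Python sketch below scores each work by how unlike its predecessors it is and how strongly it is echoed by later works; the feature vectors, dates and weighting are made-up placeholders for classeme-style descriptors.

```python
import numpy as np

def cosine(a, b):
    """Similarity between two classeme-style feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def creativity_scores(features, years, novelty_weight=0.5):
    """Score each painting by novelty (unlike earlier works) plus influence (echoed by later works)."""
    scores = []
    for f, y in zip(features, years):
        earlier = [cosine(f, g) for g, yr in zip(features, years) if yr < y]
        later = [cosine(f, g) for g, yr in zip(features, years) if yr > y]
        novelty = 1.0 - max(earlier, default=0.0)   # dissimilar to what came before
        influence = max(later, default=0.0)         # similar to what came after
        scores.append(novelty_weight * novelty + (1.0 - novelty_weight) * influence)
    return scores

# Toy example: three "paintings" with made-up feature vectors and dates.
feats = [np.array([1.0, 0.0]), np.array([0.9, 0.1]), np.array([0.0, 1.0])]
years = [1500, 1600, 1700]
print(creativity_scores(feats, years))   # the pioneering, widely echoed work scores highest
```
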
Putting it all together and running the algorithm produced a list of paintings ranked by creativity. The approach apparently worked: the researchers report that, for the most part, the algorithm’s rankings matched art experts’ assessments over the years quite closely, with only a few exceptions here and there. The team suggests the algorithm could be used in other contexts as well, such as sculpture and literature, and likely in science-based applications too.

References: http://phys.org/