Read Write Web

Why Do Some Old Programming Languages Never Die?

14 hours 40 min ago

Many of today’s most well-known programming languages are old enough to vote. PHP is 20. Python is 23. HTML is 21. Ruby and JavaScript are 19. C is a whopping 42 years old.

Nobody could have predicted this. Not even computer scientist Brian Kernighan, co-author of the very first book on C, which is still being printed today. (The language itself was the work of Kernighan's co-author Dennis Ritchie, who passed away in 2011.)

“I dimly recall a conversation early on with the editors, telling them that we’d sell something like 5,000 copies of the book,” Kernighan told me in a recent interview. “We managed to do better than that. I didn’t think students would still be using a version of it as a textbook in 2014.”

What’s especially remarkable about C's persistence is that Google developed a new language, Go, specifically to more efficiently solve the problems C solves now. Still, it’s hard for Kernighan to imagine something like Go outright killing C no matter how good it is.

“Most languages don’t die—or at least once they get to a certain level of acceptance they don’t die," he said. "C still solves certain problems better than anything else, so it sticks around.”

Write What You Know

Why do some computer languages become more successful than others? Because developers choose to use them. That’s logical enough, but it gets tricky when you want to figure out why developers choose to use the languages they do.

Ari Rabkin and Leo Meyerovich are researchers from, respectively, Princeton and the University of California at Berkeley who devoted two years to answering just that question. Their resulting paper, Empirical Analysis of Programming Language Adoption, describes their analysis of more than 200,000 SourceForge projects and polling of more than 13,000 programmers.

Their main finding? Most of the time programmers choose programming languages they know.

“There are languages we use because we’ve always used them,” Rabkin told me. “For example, astronomers historically use IDL [Interactive Data Language] for their computer programs, not because it has special features for stars or anything, but because it has tremendous inertia. They have good programs they’ve built with it that they want to keep.”

In other words, it’s partly thanks to name recognition that established languages retain monumental staying power. Of course, that doesn’t mean popular languages don’t change. Rabkin noted that the C we use today is nothing like the language Kernighan first wrote about, which probably wouldn’t be fully compatible with a modern C compiler.

“There’s an old, relevant joke in which an engineer is asked which language he thinks people will be using in 30 years and he says, ‘I don’t know, but it’ll be called Fortran’,” Rabkin said. “Long-lived languages are not the same as how they were when they were designed in the '70s and '80s. People have mostly added things instead of removed because that doesn’t break backwards compatibility, but many features have been fixed.”
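The add-without-removing pattern Rabkin describes is easy to see in Python, to pick one long-lived language. A minimal illustration of my own (not from the researchers): the printf-style string formatting Python inherited from C was never retired when a newer mechanism arrived, so decades-old code keeps running.

```python
# Two generations of Python string formatting coexist: the %-operator
# (present since the early 1990s, borrowed from C's printf) and
# str.format (added in Python 2.6). The old form was kept when the
# new one arrived, so existing programs never broke.
old_style = "C is %d years old" % 42          # printf-style, from C
new_style = "C is {} years old".format(42)    # added later; old style kept

assert old_style == new_style == "C is 42 years old"
```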

This backwards compatibility means that not only can programmers continue to use languages as they update programs, they also don’t need to go back and rewrite the oldest sections. That older ‘legacy code’ keeps languages around forever, but at a cost. As long as it’s there, people’s beliefs about a language will stick around, too.

PHP: A Case Study Of A Long-Lived Language

Legacy code refers to programs—or portions of programs—written in an outdated language or an obsolete version of one. Think, for instance, of key functions for a business or engineering project written in a language that no one supports anymore. They still carry out their original purpose and are too difficult or expensive to rewrite in modern code, so they stick around, forcing programmers to turn handsprings to keep them working even as other code changes around them.

Any language that's been around more than a few years has a legacy-code problem of some sort, and PHP is no exception. PHP is an interesting example because its legacy code is distinctly different from its modern code, in what proponents say—and critics admit—is a huge improvement.

Andi Gutmans is a co-inventor of the Zend Engine, the runtime that became PHP's standard engine with PHP4. Gutmans said he and his partner originally wanted to improve PHP3, and were so successful that PHP's original inventor, Rasmus Lerdorf, joined their project. The result was the engine behind PHP4 and its successor, PHP5.

As a consequence, the PHP of today is quite different from its progenitor, the original PHP. Yet in Gutmans' view, the base of legacy code written in older PHP versions keeps alive old prejudices against the language—such as the notion that PHP is riddled with security holes, or that it can't "scale" to handle large computing tasks.

See also: PHP, Once The Web's Favorite Programming Language, Is On The Wane

"People who criticize PHP are usually criticizing where it was in 1998,” he says. “These people are not up-to-date with where it is today. PHP today is a very mature ecosystem.”

Today, Gutmans says, the most important thing for him as a steward is encouraging people to keep updating to the latest versions. “PHP is a big enough community now that you have big legacy code bases,” he says. “But generally speaking, most of our community is on PHP 5.3 at minimum.”

The issue is that users never fully upgrade to the latest version of any language. It’s why many Python users are still using Python 2, released in 2000, instead of Python 3, released in 2008. Even after six years, major users like Google still haven’t upgraded. There are a variety of reasons for this, but the upshot is that many developers remain wary of taking the plunge.
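The Python 2-to-3 split shows concretely why upgrades stall: Python 3 changed core semantics, so code that was correct under Python 2 can silegently behave differently rather than fail loudly. A small illustration of my own (not from the article):

```python
# Two incompatibilities that stalled Python 2 -> 3 migrations.
# Under Python 2, 7 / 2 evaluates to 3 (integer division) and
# print is a statement. Under Python 3, / always yields a float
# and print is a function.

print(7 / 2)    # Python 3 prints 3.5; Python 2 would have printed 3
print(7 // 2)   # floor division: the Python 3 spelling of the old behavior
```

Because results change silently, large codebases like the ones Google maintains must be audited line by line before an upgrade, which is exactly the kind of cost that keeps teams on the old version.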

See also: Why The JavaScript World Is Still Waiting For Node.js 1.0

“Nothing ever dies," Rabkin says. "Any language with legacy code will last forever. Rewrites are expensive and if it’s not broke don’t fix it.”

Developer Brains As Scarce Resources

Of course, developers aren’t choosing these languages merely to maintain pesky legacy code. Rabkin and Meyerovich found that when it comes to language preference, age is just a number. As Rabkin told me:

A thing that really shocked us and that I think is important is that we grouped people by age and asked them how many languages they know. Our intuition was that it would gradually rise over time; it doesn’t. Twenty-five-year-olds and 45-year-olds all know about the same number of languages. This was constant through several rewordings of the question. Your chance of knowing a given language does not vary with your age.

In other words, it’s not just old developers who cling to the classics; young programmers are also discovering and adopting old languages for the first time. That could be because the languages have interesting libraries and features, or because the communities these developers are a part of have adopted the language as a group.

“There’s a fixed amount of programmer attention in the world,” said Rabkin. “If a language delivers enough distinctive value, people will learn it and use it. If the people you exchange code and knowledge with you share a language, you’ll want to learn it. So for example, as long as those libraries are Python libraries and community expertise is Python experience, Python will do well.”

Communities are a huge factor in how languages do, the researchers discovered. While there's not much difference between high-level languages like Python and Ruby, for example, programmers are prone to developing strong feelings about the superiority of one over the other.

“Rails didn’t have to be written in Ruby, but since it was, it proves there were social factors at work,” Rabkin says. “For example, the thing that resurrected Objective-C is that the Apple engineering team said, ‘Let’s use this.’ They didn’t have to pick it.”

Through social influence and legacy code, our oldest and most popular computer languages have powerful inertia. How could Go surpass C? If the right people and companies say it ought to.

“It comes down to who is better at evangelizing a language,” says Rabkin.

Lead image by Ken Fager

Categories: Technology

Proof That Instagram's Hyperlapse App Makes Everyone's Videos Better

15 hours 39 min ago
<em>Editor's note: This post was originally published by our partners at <a href="http://www.popsugar.com/tech/Creative-Hyperlapse-Videos-35580606">PopSugar Tech</a>.</em>

Instagram has been an outlet for photographic and video storytelling since it was first introduced, and though we've seen new photo apps to make your Instas shine come and go over the past few years, none have been quite as hyped up as the latest and maybe greatest—Hyperlapse.

The user-friendly app from Instagram itself has simplified the process of producing sleek, smooth, and creative high-quality time-lapse videos. Check it out in action below.

All the little ants are marching.

Jimmy Fallon tried the app out on The Tonight Show.

You literally can't lose with a cute dog video.

This is kind of like The Office meets The Maze Runner.

The app works wonders with shots of water.

Karlie Kloss coding on a laptop in hyperspeed!

And if you need more help getting started, iJustine has got you covered.

More stories from PopSugar Tech:

How To Stream All The Music You Want, Without Burning Data
Ringly's High-Tech Jewelry Gets An Edgy New Look
For Perfect Posture, Wear This Gadget
4 Throwback Apps To Indulge Your Inner '90s Kid
TiVo Now Has A DVR For TV Antennas

Categories: Technology

Apple Is Walking Into Payments Naked

20 hours 34 min ago

The drumbeat of reports that Apple is entering the payments business is growing louder and louder. The company has deals with Visa and MasterCard, Bloomberg reports, and American Express, says Recode. 

An announcement should be just a week away: The company has invited journalists to an event in Silicon Valley on Tuesday, September 9.

So are we that close to using our iPhones as wallets?

Adam and Eve and the Serpent. (Photo by <a href="https://www.flickr.com/photos/mikecogh/5863997874/">mikecogh</a>)

What Apple Must Bite Into

The technology is one thing. Apple has any number of means of transmitting payment data—via Bluetooth, NFC, or just over the Internet. And it has long experience in e-commerce, dating back to the late-'90s creation of the online Apple Store, which predates its physical retail outlets.

See also: Apple May Tap Old Tech For Payments

But Apple has a number of challenges it must solve before it can enter the payments market. Those fall into three broad areas: regulation, fraud, and customer service.

Apple doesn't appear to have taken any of the obvious public steps someone getting into the business of moving money would need to do. For example, it hasn't registered with its home state of California as a money transmitter, as Amazon, Google, Square, and PayPal have. (Even Airbnb, the lodging marketplace, has registered, since it holds rental funds for a few days before disbursing them to hosts.)

Apple can argue that it doesn't need to register if a banking partner is handling the financial aspects of its payments business. And it will need a partner—like Chase, Wells Fargo, or Bank of America—if it wants to process Visa and MasterCard transactions for other retailers. (American Express is different, since it both issues cards and processes transactions: Apple can deal directly with that company for transactions involving its cardmembers.)

Fraud is another issue. Apple will likely find ways to protect payment-card details from obvious forms of hacking. But what about merchants and consumers who are out to trick each other, selling bum merchandise or using stolen card numbers? Apple or its banking partners or both will have to figure that out—and any new payment system is attractive to fraudsters.

There's also Apple's lousy track record with digital security, freshly in the news thanks to the apparent hacks of several celebrities' iCloud photo backups. In 2012, Wired writer Mat Honan lost years' worth of documents when his Apple accounts were hacked, allowing hackers to remotely wipe clean his iPhone, iPad, and MacBook. 

Apple is far from alone in having security problems, and has been working to improve its systems—it just fixed a vulnerability that may have allowed hackers to get their hands on celebrities' intimate photos. But none of this is reassuring when you think about using an account Apple is guarding to make online purchases. If anything, handling payments makes Apple a richer target for hackers.

And then there's customer service. What are merchants supposed to do if a transaction doesn't go through? Line up at the Apple Store? In 2012, Apple laid out plans to hire 3,600 more workers in its Austin, Texas campus, largely to staff up its AppleCare support operations. Those phone lines will be busy.

What Apple Brings To The Cash Register

Apple has undeniable assets to bring to a payments business—chiefly its 800 million customer accounts on iTunes with credit cards on file. 

It also has intriguing experiments under way, like its Apple Store app, which lets you buy things in the store with your iTunes account without needing to swipe a card or even talk to an employee: You just pay on your phone, grab, and go. It also has iTunes Pass, a program which lets you walk into an Apple Store with cash and load it up on your iTunes account. Especially in developing countries, a link between cash and digital spending could be transformative.

Connecting Apple's physical infrastructure with a payment service could be its biggest advantage. Small merchants pick up iPads and card readers at the Apple Store already. Amazon, Google, and PayPal don't have that kind of physical touch point with businesses.

But before you can start paying with your iPhone, Apple has a lot of questions to answer: Where can you buy things? What will it cost merchants? And who will help if something goes wrong?

Photo by mikecogh

Categories: Technology

The Secret To Hiring Great Developers Is Hiding In Plain Sight

22 hours 46 sec ago

Everyone wants to hire more engineers, including you, driving software salaries through the roof. Unfortunately, it's very likely that you don't have the slightest clue how to recruit well.

Take heart. While your ability to spot real talent in an interview may be weak, open source makes it relatively easy to see who can actually code, and who simply knows how to answer useless, abstruse questions. 

Engineering Success

Finding great technical talent is important. In fact, in a world increasingly run by developers, I'd argue that it's the most important thing any company does, whether it's a technology vendor or a manufacturer of cars or clothes. The better the engineering, the better the product, and the better the product, the less reliant your company needs to be on sales and marketing, at least, early on. 

Or, as venture capitalist Fred Wilson puts it, "Marketing is for companies who have sucky products."

The problem, of course, is that everyone is scouring the planet for the same engineers. Which, in turn, has driven developer salaries way up:

Chart by CNBC

There are all sorts of gimmicks to finding great engineers. Google, for example, used to impose complex brainteasers on job applicants—only to discover they were utterly useless, as Laszlo Bock, senior vice president of people operations at Google, said:

We found that brainteasers are a complete waste of time. They don't predict anything. They serve primarily to make the interviewer feel smart.

Brainteasers, then, are out.

But, as Bock went on to highlight, so are brand-name schools, test scores and grades. "Worthless," he declares. In fact, the whole hiring process is a "complete random mess."

So how can you fix this?

Changing The Interview Process

One way is to change the way you interview. As Laurie Voss, the CTO of NPM, recently argued, "You are bad at giving technical interviews.... You’re looking for the wrong skills, hiring the wrong people, and actively screwing yourself and your company."

Sadly, he's probably right. And not just about you. We're all pretty bad at technical interviews (or interviews generally, for that matter). 

The gist of his post is that too often we "over-valu[e] present skills and under-valu[e] future growth," hiring people based on what they've done (or where they went to school) rather than what they can do. Or, as he summarizes:

1. Many interview techniques test skills that are at best irrelevant to real working life.
2. You want somebody who knows enough to do the job right now.
3. Or somebody smart and motivated enough that they can learn the job quickly.
4. You want somebody who keeps getting better at what they do.
5. Your interview should be a collaborative conversation, not a combative interrogation.
6. You also want somebody who you will enjoy working with.
7. It’s important to separate “enjoy working with” from “enjoy hanging out with.”
8. Don’t hire [jerks], no matter how good they are.
9. If your team isn’t diverse, your team is worse than it needed to be.
10. Accept that hiring takes a really long time and is really, really hard.

Bock echoes this, indicating that Google's experience has been that behavioral interviews work best. Rather than asking a candidate to remember some obscure computer science fact, Google now starts with a question like:

"Give me an example of a time when you solved an analytically difficult problem.” The interesting thing about the behavioral interview is that when you ask somebody to speak to their own experience, and you drill into that, you get two kinds of information. One is you get to see how they actually interacted in a real-world situation, and the valuable “meta” information you get about the candidate is a sense of what they consider to be difficult.

This is a great approach, but there's a way to take it one step further.

Open Source Your Interview

The best place to see how engineers solve problems in the real world is in the open-source projects to which they contribute. Open-source communities offer a clear view into an engineer's interactions with others, the quality of her code and a history of how she tackles hard problems, both individually and in groups.

No guesswork. No leap of faith. Her work history is all there on GitHub and message boards.

But open source offers other benefits, too. As Netflix's former head of cloud operations, Adrian Cockcroft, once detailed, open source helps to position Netflix as a technology leader and to "hire, retain and engage top engineers." How? Well, the best engineers often want to work on open source. Providing that "perk" is essential to hiring great technical talent. 

Interviews are important to ascertain cultural fit, among other things, but they shouldn't be a substitute for the more informative work of analyzing a developer's open source work. 

And if they have none, well, that tells you something, too. A colleague at a former company told me that the best engineers were all on GitHub, not LinkedIn. While perhaps an overstatement, there's a fair amount of truth to it, too. 

In sum, you should be able to get to know your next engineering hire through open-source development far better than through any interview process, no matter how detailed. 

Photo via Shutterstock

Categories: Technology

Watch Out: Mobile Search Will Make Facebook Stalking Easier Than Ever

Fri, 2014-08-29 19:08

You may want to take a quick gander through your entire Facebook history, because Facebook is testing a feature that makes your status updates searchable on mobile. And I bet there are some updates you just wouldn’t want to be found.

The feature isn't entirely new; Facebook began making all your updates searchable on the Web last year as part of its Graph Search function. That uses Facebook’s massive social database to churn out results to highly specific queries like, “Friends who like pizza." 

Searching posts and status updates isn't available for everyone yet, even on the Web. Eventually, though, Facebook will make everyone's history transparent—exposing, along the way, the years of posts you've shared with the social network, either publicly or with friends.

See also: Your Transparent Life: Facebook Makes All Posts Searchable

The mobile feature, first spotted by Bloomberg, will show posts that you would see if you combed through Facebook histories anyway, just like on the Web. A Facebook spokesperson told me, “We’re testing an improvement to search on mobile. In this test you can use keywords to search for posts you’re in the audience for on Facebook.”

How To Prevent It

If you don't want your entire post history discoverable via Graph Search, don't worry. Facebook gives you the ability to change the audience for all your past posts. 

See also: Facebook, If You're Serious About Privacy Controls, Let Me Control Them

1. On both Web and mobile, the setting to make all your past posts viewable by friends only can be found under "Settings."

2. Click "Privacy."

3. Under "Who can see my stuff?" click "Limit past posts."

4. Facebook will then describe what this will do. But in a nutshell it means: only friends can see your posts, even ones that were public in the past. Click "Limit old posts."

5. Facebook will ask you one more time if you're sure you want the audience on your posts limited. Yes, you do. Confirm it.

Searching past posts might be the best—or worst—thing to happen to the timeline since Facebook introduced it back in September 2011. It will be helpful for finding your own status updates from years ago (remember that one time we went camping on the beach?), but it will also let other people find them, too. And if you don’t remember every detail you’ve posted for the last decade, revealing them could get embarrassing. 

Lead image by michell zappa; screenshots by Selena Larson for ReadWrite

Categories: Technology

Google Challenges Amazon's Drone Delivery Program With Project Wing

Fri, 2014-08-29 15:22

Google has been secretly testing delivery by drone, the company announced Thursday.

A team of engineers at Google X, the technology company’s long-range research lab, safely carried out more than 30 one-kilometer test flights this month. The deliveries, consisting of items ranging from a chocolate bar to first aid, took place in Queensland, Australia, to avoid the Federal Aviation Administration’s strict U.S. restrictions on drones.

See also: Why Commercial Drones Are Stuck In Regulatory Limbo

Now that Amazon has almost convinced the world its delivery drones aren’t a publicity stunt, the world may be ready to accept Google at its word.

The Google X drone is a quadcopter, but it looks nothing like the ones many U.S. hobbyists use for aerial photography and other projects, or like Amazon’s Prime Air octocopter. Instead, it relies on fixed wings for fast forward flight and on its four rotors for vertical takeoff and landing. The company released a YouTube video to show how it flies.

Project Wing, as the video labels the drone, is capable of carrying a roughly four-pound package. Meanwhile, Amazon says Prime Air can carry up to five pounds. Despite the design differences, it’s apparent that Google’s drone could realistically compete with Amazon’s.

See also: Amazon Tells The Feds It Really Wants To Test Drone Delivery

According to Astro Teller, Google X’s Captain of Moonshots—what Google calls its biggest, craziest ideas—delivery is just the beginning. Google envisions being able to use the drones for humanitarian solutions, too.

“Even just a few of these, being able to shuttle nearly continuously could service a very large number of people in an emergency situation,” Teller told the BBC.

Screenshot via Google X

Categories: Technology

What Sort Of Man Shares Playboy? (He Friends His Mom On Facebook)

Fri, 2014-08-29 14:31

Playboy relaunched its website on Wednesday, and one thing's for sure. This isn't your grandpappy's gentleman's magazine. 

Well, obviously. It's on the Internet—as opposed to a moldy box in the garage or hastily stuffed under your mattress where you shamefully hope Mom doesn't find it. As such, Playboy.com is ready for its Facebook close-up, fully clothed and safe-for-work (or SFW, as the kids say). Just in case you want to share its family-friendly content on any social network where Mom might see it. 

Cory Jones, Playboy's senior VP of digital content, aptly explained to AdAge the new site's savvy social media strategy. To wit: "Everyone's mom is on Facebook."

Indeed.

"We're having some kind of moment in American sexual culture," Rick Schindler, culture critic and author of “Fandemonium,” told ReadWrite. "Would Don Draper of Mad Men share Playboy with his mom? I guess, given she was a hooker."

This, however, is a new millennium. Take, for example, Playboy's feminist-friendly "Should You Catcall Her?" flowchart that went viral this week, a fun and funny chart which informs readers there are really only two circumstances in which such behavior is OK. (SPOILER ALERT: One of them is if she's actually a cat.) Lots of people and media outlets shared it; if anyone's mother took offense, she kept it to herself.

That's the way things are now on Playboy.com. When you click on a Facebook link to some fascinating Playboy article, whether one of its branded 20Q (20 questions) celebrity interviews or its growing collection of "sharable" infographics, there's little to no chance of stumbling upon scantily clad women.

(Here, for example, is a link to the Playboy 20Q with Idris Elba of The Wire and Mandela fame, which you should feel perfectly comfortable with clicking on while at work, even if you work in one of those free-range call-center office-pen dealies so popular now.)

Notably, the lack of lady skin doesn't extend to the site's sponsored listicle for Sin City: A Dame To Kill For, featuring a scantily clad and provocatively posed Jessica Alba. (Of course, these are the same Sin City posters your kid can see when you open the newspaper.)

"What we're going to be doing is, if you go to a Playboy.com link that's not around girls, there won't be girls around the page," Jones told AdAge. It's the obvious next step in Playboy's savvy social media strategy to ingratiate itself with the ladies, who share on Facebook just the eensiest bit more than the gents, according to the Pew Research Internet Center.

The 60-year-old printed Playboy magazine is still very much alive with naked ladies, and still accounts for most of the empire's income, AdAge reported. The new digital strategy for its 20-year-old website, however, is aimed at upping the social sharing most media outlets increasingly value. "The editorial mantra for content is to ask ourselves, 'Would you send this to a friend?'" Jones said. 

More importantly, would you send this to your mom?

Lead image by Joel Kramer

Categories: Technology

Friday Fun: Build Your First Chrome Extension

Fri, 2014-08-29 14:01

Google Chrome is the most popular Web browser in the world. Part of its appeal comes from its ability to let you fully customize your browsing experience with a slew of extensions. Extensions are small, lightweight programs that personalize your Chrome installation with new features.

You’ve probably already downloaded an extension or two. But did you know it’s almost as easy to build your own? Chrome extensions are written in a relatively beginner-friendly language—JavaScript—and require only two files to function.

Since they’re so easy to build, there are currently more than 53,000 extensions in the Chrome Web Store, ranging from productivity tools to stupid entertainments.

Today's Project

Today we’re going to build a Chrome extension that isn’t particularly useful, though it's sort of funny. We’ll be transforming Steven Frank’s Cloud to Butt Plus extension, which edits every Web page you visit by replacing the phrase “the cloud” with “my butt.” You can judge the results for yourself.

With Frank’s permission, we’ll be creating a derivative work out of his Cloud to Butt GitHub repository in order to build a “find and replace” extension of our own.

Since I rarely build a coding project that isn’t trolling my coworkers in some way, my example envisions the Web the way my Paleo editor Owen Thomas probably sees it. The Paleo diet puts carbohydrates off limits, so I decided to make his dietary choices simple by making “bread,” “pasta,” and related taboo foodstuffs less appealing to him on the Web.

My finished extension is on GitHub for anyone who wants to use it. Here’s how it works.

Anatomy Of An Extension

Extensions piggy-back off of existing Chrome functionality to add new features. This means anyone can build an extension using HTML, CSS, and JavaScript, without having to learn to work with Chrome’s native code. As the Chrome Developer site promises:

If you've ever built a web page, you should feel right at home with extensions pretty quickly.

That’s quite an assertion, and it certainly depends on the complexity of the extension you want to build. Still, all you need for a basic extension are these:

1. A manifest.json file. Here, .json stands for JavaScript Object Notation. This manifest file stores metadata about our extension and shows Chrome how to use it. Every manifest file includes the extension’s name and description for Chrome Web Store browsers. After that, it declares dependencies, permissions, and any browser actions the extension will perform.

2. A JavaScript or HTML file. Here’s where you write the program detailing what your extension does. In the example the Chrome Developer site gives, it’s popup.html, a page that delivers cute cat photos to extension-users. For more complex extensions, it’s a JavaScript file containing a program that delivers the meat of the extension.

3. An icon. Actually, this is optional, but it's helpful and certainly looks cute when your extension is installed. For best results, save your icon as three square images at resolutions of 16px, 48px, and 128px.

Building Manifest.json

At its very minimum, a manifest file needs only to include a name and a version. At 17 lines, ours does a little more. (Here's the full thing in one place.)

This part includes all the metadata:

{
  "manifest_version": 2,
  "name": "Caaaarbs",
  "version": "1.0",
  "description": "Paleo's best friend.",
  "icons": {
    "16": "images/carbs16.png",
    "48": "images/carbs48.png",
    "128": "images/carbs128.png"
  },

Manifest_version refers to the version of the file format we’re using. Chrome requires that you use version 2, so that’s what we’ve indicated.

Next comes the extension name, version, and description. These are really up to you.

After that, I listed out the extension’s icon sizes. First, I picked an image that I thought fit my extension—a royalty-free vector graphic of a croissant—and then resized it in Photoshop to the three required dimensions. Now, Chrome automatically puts the correct size of the icon where it is needed.

Here's the rest of the file:

  "content_scripts": [
    {
      "matches": ["*://*/*"],
      "js": ["myscript.js"],
      "run_at": "document_end"
    }
  ]
}

These are the content scripts that make the extension tick. The first one here simply indicates that my extension will do its thing on any website. Under different circumstances, you could edit the asterisk wildcards to limit use of the extension to particular pages—you know, like http://readwrite.com.

The second line tells Chrome to load the extension's underlying program from a JavaScript file named myscript.js. That’s where the whole “find and replace” function lives.

Finally, the third line instructs my extension to run after the full page has loaded in the browser window. If it ran before I brought up a site, some of the words I want to find and replace might not have loaded yet!

Building Myscript.js

This file may be 40 lines long (see it here), but it’s mainly home to two JavaScript functions. In programming, a function is a reusable bit of code that performs a specific task.

The first function, called walk, executes an action that JavaScript programmers refer to as “walking the DOM.” DOM stands for Document Object Model, which is a code-based representation of a Web page and every element—text, images, form fields, and so forth—on it. It sort of resembles an upside-down tree, with a single trunk at the top and a bunch of ordered code "branches" below.

The walk function explores the whole tree, starting at the trunk and moving down to the end of the first branch, then back up until it finds another branch to examine. Basically, it's crawling all the data on the page to locate the textual elements.

That's where the second function, handleText, comes in. When walk finds some text, handleText scans for the words we want to replace, and then replaces them wherever it finds them.
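Here's a simplified, runnable sketch of how the two functions cooperate. It's my own illustration, not the exact code from the repository: the mock "page" object stands in for the browser's real DOM tree, and only two replacement words are shown.

```javascript
// Simplified sketch of walk and handleText. In the real extension,
// walk starts from document.body and visits real DOM nodes; here a
// tiny mock tree makes the recursion easy to follow.

function handleText(textNode) {
  var v = textNode.nodeValue;
  // One line per taboo word, just like the real script.
  v = v.replace(/\bbread\b/g, "caaaaarbs");
  v = v.replace(/\bpasta\b/g, "caaaaarbs");
  textNode.nodeValue = v;
}

function walk(node) {
  if (node.nodeType === 3) {          // 3 means "text node"
    handleText(node);
    return;
  }
  // Otherwise it's a branch: climb down each child in order.
  for (var i = 0; i < node.childNodes.length; i++) {
    walk(node.childNodes[i]);
  }
}

// A mock "page": one element node with two text-node children.
var page = {
  nodeType: 1,
  childNodes: [
    { nodeType: 3, nodeValue: "I love bread.", childNodes: [] },
    { nodeType: 3, nodeValue: "Also pasta!", childNodes: [] }
  ]
};

walk(page);
console.log(page.childNodes[0].nodeValue); // "I love caaaaarbs."
console.log(page.childNodes[1].nodeValue); // "Also caaaaarbs!"
```

In the browser, the same walk over document.body is what makes every page you visit come out carb-free.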

How does it know which words to replace? We specified that this way:

v = v.replace(/\bbread\b/g, "caaaaarbs");

This is one of five lines that specify the words I want to swap out. You can choose any number of words for substitution, though each one will need a line like the one above. (It's not the most graceful program ever written, but it is straightforward.)

Some technical details, for those who are interested: "v" is a variable that stores a temporary copy of "textNode.nodeValue"—i.e., the text in a particular text element called "textNode." The v.replace method rewrites that text by swapping everything matched by the first argument (the pattern before the comma) for the second argument (the word "caaaaarbs"). The pattern in the example above is a regular expression—a dense bit of code that matches every standalone occurrence of the word "bread."

At the end of the function, the temporary value stored in "v" gets copied back to "textNode.nodeValue" and then written into the code representation of the Web page—which then displays your change in the browser.
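To see why the \b word boundaries in that pattern matter, here's a standalone snippet you can run on its own (the sample sentence is mine, not from the extension):

```javascript
var v = "Fresh bread and breadsticks for breakfast.";

// \b marks a word boundary, so only the standalone word "bread"
// matches; "breadsticks" and "breakfast" are left alone. The /g
// flag makes replace swap every match, not just the first one.
v = v.replace(/\bbread\b/g, "caaaaarbs");

console.log(v); // "Fresh caaaaarbs and breadsticks for breakfast."
```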

Uploading to Chrome

Collect your manifest.json, myscript.js, and your icons in a new folder by themselves. Now, navigate to chrome://extensions/ in your browser window.

Now, click the checkbox to put Extensions in “Developer Mode.” This will give you a few more options regarding what you can do with your extensions.

Click “Load unpacked extension…” and navigate to your Chrome extension folder. If all is well, it should upload without a hitch. If it returns an error, there's most likely a syntax error in your code, so check it and try again.

Success! The image above shows my extension among others I’ve installed.

Now check out all the code to my extension on GitHub, clone your own copy, and make your own find-and-replace extension. I used this to prank my editor, but the possibilities are endless! You could prank your family or friends, too. Or you could even—gasp—use the find-and-replace action to create something useful! 

In any case, I'd love to see what you build. Tell us all about it in comments.

Engineer Jack Lawson contributed to this article. 

Top photo by Darren Harve; all screenshots by Lauren Orsini

Categories: Technology

How Twitch And YouTube Are Making Video Games A Big Business

Fri, 2014-08-29 13:01

Mark this moment: Watching people play video games has become a big business, with Amazon, Google, Disney, and others vying for a piece of the action.

Call it livestream gaming: Top players record themselves playing popular titles and delivering commentary—or compete against each other in big, live, arena-style events. 

By turning video games from a solitary living-room obsession into shared events, livestream gaming is letting advertisers tap into a hard-to-reach demographic of mostly young men.

The Game Is Afoot

That means a huge influx of money into the video-game world: It's changing from a hardware-and-software business into a Hollywood-like media operation, complete with its own celebrities, agents, studios, and networks.

The big event in livestream gaming was Amazon.com's announcement that it will acquire Twitch, a site which specializes in livestream gaming videos, for a cool $970 million, after rumors that Google's YouTube might be interested in buying it, too.

Gaming as a spectator sport has already attracted audiences of millions of online users, most of whom watch it on gaming-dedicated YouTube channels or on Twitch. It's no longer an online subculture: For many teens, it is their mass media.  

And the war to capitalize on livestream gaming and its personalities is underway. Maker Studios, a Disney-owned Web network that operates like a talent agency for popular YouTube stars, has just partnered with one of YouTube’s top gaming channels, The Diamond Minecart.

The Diamond Minecart joins two other Maker-represented gaming channels, PewDiePie and Stampylonghead. That means Maker Studios now has the top three most-subscribed gaming channels on YouTube. And it means Disney and Google have partnered up against Amazon and Twitch. It's on like Donkey Kong!

Gamers at Insomnia 52, the UK's biggest gaming festival.

Livestream gaming grew along with YouTube. Video-game streams were just one more genre of YouTube's early bedroom-webcam confessionals. What else would teens talk into the camera about?

But as passionate and engaged fan communities blossomed into subscriber bases numbering in the millions, big businesses began to take notice.

Swede Idea

PewDiePie trying Oculus Rift, a virtual-reality gadget.

While Twitch specializes in gaming, the topic is no stranger to YouTube. PewDiePie, YouTube's most-subscribed channel, is the online-video home of Felix Kjellberg, a 24-year-old Swede. He has 19 million subscribers, but he's just at the apex of a community of gamers on YouTube who garner massive fan followings by uploading videos of themselves playing games. 

Some advertisers are already tapping into their popularity.

Kjellberg recently agreed to appear in a promotion for Hollywood horror movie As Above, So Below. The film's marketers sent Kjellberg to Paris to record himself looking for missing keys within a haunted catacomb, complete with zombies and live cockroaches. It works particularly well because of the similarities to the horror-themed video games he often plays.

Video-game publisher Ubisoft partnered with popular YouTube comedy duo Smosh to create a song in 2012 for the release of Assassin's Creed 3. The accompanying music video now has a total of 54 million views.

In 2012, the makers of first-person shooter Halo released Halo 4: Forward Unto Dawn, a science-fiction action show, on the YouTube channel Machinima Prime. The show has since been added to Netflix’s roster of content.

Entering The Arena

Livestream gaming was born on the Internet—but it's jumping into the physical world. 

Where livestream gaming involves posting videos, esports—short for "electronic sports"—involves quasi-athletic video-game competitions, often staged in big venues before live audiences and, increasingly, broadcast on television.

League of Legends World Championships

In October 2013, video-game publisher Riot Games held its League of Legends World Championships at the Staples Center in Los Angeles. The arena was sold out, and 32 million gaming fans watched the competition online.

In July, ESPN, the Disney-owned cable network, broadcast The International, a tournament which featured an online battle-arena-style game called DOTA (Defense of the Ancients) 2, with an $11 million prize. 

League of Legends gamers.

So what can we expect for the future of livestream gaming? We’re already seeing Disney, Google, and Amazon getting into the mix, pouring eye-popping amounts of money into acquisitions and poaching top YouTube gamers.

Google and Disney's interests are obvious, since they are big sellers of advertising with a keen interest in teenage audiences. Amazon's interest in Twitch came as a surprise to many—but it, too, has an increasing interest in video games, thanks to its Kindle tablets and its in-house game studios, as well as in online advertising, where it hopes to challenge Google.

Others are likely to pile into the market now. The big winners may be anyone who loves video games. Heck, you don't even have to play them anymore. You can just lean back and watch.

Lead image by Madeleine Weiss; images courtesy of Riot Games, Tubefilter, PewDiePie, The Diamond Minecart, and Flickr user artubr

Categories: Technology

Why Doesn't Your Government Run Like This?

Fri, 2014-08-29 12:02

Imagine your company got a 90% approval rating on its latest app. You'd be rich, right? Now imagine that your government got a 90% approval rating on anything, like passport approval or paying a parking violation or ... well, anything.

No, really. Stop laughing.

In the UK, a slew of changes to how technology is delivered has upended decades of dissatisfaction with government services. So much so, in fact, that one recent service upgrade netted a 90% approval rating from citizens.

What's going on?

Putting Citizens First

Mike Bracken (@MTBracken), executive director of the UK's Government Digital Service, is a man on a mission, and that mission is to revolutionize how the UK government serves its citizens. After joining the UK's Cabinet Office in July 2011, Bracken created the GDS and launched an all-out assault on IT incompetency.

Mike Bracken, executive director of the UK Government Digital Service

His first big move was to change the definition of service. As he tells McKinsey & Co.:

Government around the world is pretty good at thinking about its own needs. Government puts its own needs first—they often put their political needs followed by the policy needs. The actual machine of government comes second. The third need then generally becomes the system needs, so the IT or whatever system’s driving it. And then out of those four, the user comes a poor fourth, really.

So Bracken "inverted that," creating a set of Design Principles that govern how the UK thinks about citizen services:

1. Start with needs (user needs, not government needs)
2. Do less
3. Design with data
4. Do the hard work to make it simple
5. Iterate. Then iterate again
6. Build for inclusion
7. Understand context
8. Build digital services, not websites
9. Be consistent, not uniform
10. Make things open: it makes things better

Such design principles would serve any organization well. Now imagine bumping into a government worker who has incorporated these principles into her DNA. Amazing. 

Opening And Sharing

One of the most impressive things I've seen Bracken and his team do, however, is to inculcate a culture of sharing through GDS. In a recent blog post, Bracken describes how two organizations ended up sharing code to solve a common problem:

Early on in the project, the Registered Traveller team found they needed software to help them work out what all the different possible visa documents are—a big task. Thankfully, the same problem had already been solved by another team in the Home Office working on another exemplar project. The Visit Visa team built a product catalogue to help make sense of it all, and shared their work on GitHub.... So even at that early stage, the Registered Traveller team was able to make use of the Visit Visa team’s code, saving themselves considerable time and effort and ensuring their own work didn’t get held up.

GDS isn't alone in turning to GitHub. As recently reported, more than 10,000 government users are active on GitHub today, with an ever-growing number of code repositories residing on the popular service:

Credit: GitHub

But Bracken's UK team is pushing this sharing agenda much more actively than most, and to good effect. Even as President Obama's Healthcare.gov was melting down, the UK's gov.uk was scaling to meet user needs without a hitch.

Not because it was designed to scale perfectly from the start in a traditional waterfall development process—but rather because Bracken had infused the UK government with an agile development process that enabled them to iterate toward a scalable model. Gov.uk launched without major issues.

Why Can't Your Company Operate Like This?

Most governments struggle to function this efficiently. But then, so do most enterprises. The reason is that both tend to put user needs second to organization needs, and both struggle to share. 

On this last point, Bracken concludes, 

If one team shares code, another team benefits. Today it’s a couple of teams in the Home Office, tomorrow it might be a few more in other departments. Imagine how much better public services will be when this becomes the default behaviour across government. That’s what we’re aiming at.

Or imagine how much better your company would be. It all starts with the right principles.

Lead image by Michael D. Beckwith

Categories: Technology

ALS Charity Drops "Ice Bucket Challenge" Trademark Attempt

Fri, 2014-08-29 09:15
Actor Benedict Cumberbatch takes the Ice Bucket Challenge

The ALS Association's attempt to trademark the phrase "Ice Bucket Challenge" met with a chilly response from Internet critics. It has now withdrawn its applications with the United States Patent and Trademark Office.  

On Friday, the ALS Association announced that since July 29, it has received more than $100.9 million in donations—a 3504% increase from the $2.8 million it received during the same time period last year. Those invited to participate in the Ice Bucket Challenge have 24 hours to either donate money or take a shower under a bucket of ice-filled water, ideally posting video proof of the challenge met on the Internet. As the donations and more than 1 million videos posted on YouTube and Facebook show, many chose to do both. 

The origins of the ice bucket challenge are unclear, though it's been used to benefit other charities both before and during the ALS Ice Bucket Challenge's viral reign. The predecessor "Cold Water Challenge," which started making the rounds in mid-2013, was often issued to benefit cancer charities. 

The ALS Association Overreaches

Other charities dedicated to supporting sufferers of ALS (amyotrophic lateral sclerosis, also known as Lou Gehrig's Disease) and funding research benefited from donations inspired by the social media phenomenon. The ALS Association, however, received the lion's share, and on August 26, the nonprofit filed two trademark applications with the USPTO. The first was for "ICE BUCKET CHALLENGE" and the second for "ALS ICE BUCKET."

"This is why we can't have nice things," was one of the more, ahem, charitable criticisms repeated on Twitter after trademark attorney Erik M. Pelton posted about the ALS Association's USPTO filings on his firm's website under the pithy headline, "Let the ice bucket trademark challenges begin." Many, however, didn't spare the organization their own icy criticism:

On Friday, the same day the ALS Association thanked donors all over the world for their $100.9 million generosity, it also pulled its trademark applications. The ALS Association provided ReadWrite with the following statement: 

The ALS Association filed for these trademarks in good faith as a measure to protect the Ice Bucket Challenge from misuse after consulting with the families who initiated the challenge this summer. However, we understand the public’s concern and are withdrawing the trademark applications. We appreciate the generosity and enthusiasm of everyone who has taken the challenge and donated to ALS charities.  

In his trademark blog post, Pelton delineated the reasons he felt the ALS Association's trademark attempt was inappropriate. These criticisms included the fact that the phrase "Ice Bucket Challenge" isn't associated exclusively with fundraising for the ALS Association. He also questioned whether, if successful, the nonprofit would prevent others from using the ice bucket challenge for other charitable causes.

But Let's Consign These Guys To An Icy Hell

On the other hand, perhaps we should have taken the ALS Association at its word. It's hard to escape that feeling when you see how many outfits are trying to profit from the viral phenomenon. The organization may very well wish to protect the game that sent its donations skyrocketing from those who seek to profit from it personally. Look no further than the Ice Bucket Challenge Halloween costume for an example of why we really can't have nice things:

"ALS or Lou Gehrig's Disease is a serious illness that destroys nervous cells and lives," reads the description for this pile of plastic and tulle with a $40 price tag. "The Ice Bucket Challenge is the latest craze in "slacktivism" that helps raise awareness and donations to fund research in the ongoing struggle to end this terrible disease."  

Google Play has a host of ALS Ice Bucket Challenge-themed apps. Amazon offers an ever-increasing inventory of "I Survived The Ice Bucket Challenge" apparel for everyone from infants to adults. There are stickers and magnets too, including the "Lindsay Lohan Photo Fridge Magnet, ALS Ice Bucket Challenge" ($2.99), which originally promised "side boob" but now merely assures the potential purchaser of "Highest Quality Magnet Available."

None of these product descriptions offer much indication that your purchase will benefit anyone but the profiteers offering them. It's enough—almost—to make you ask the ALS Association to reinstate its application. 

Lead image of Benedict Cumberbatch taking the Ice Bucket Challenge via YouTube screencap

Categories: Technology

Apple Might Tap Old Tech For Mobile Payments

Fri, 2014-08-29 08:07

I rarely buy things in stores with cash anymore. Between mobile apps and credit cards, there’s very little need.

Apple reportedly wants to streamline that experience even further—by using a technology many had given up for dead.  

Wired's sources tell it that the iPhone maker is putting NFC wireless chips in its next new smartphone, presumably to allow for mobile payments. The Financial Times says that Apple is working with NXP, a Dutch chipmaker which already supplies some components to Apple.

If that's true, iPhone users would be able to tap their devices on terminals in stores, on buses and other places to pay for things, check in or transmit information. 

This once sounded like an intriguing concept … back in 2007. That’s when the first cell phone—the Nokia 6131—came out boasting NFC. Since then, other devices have supported it, such as LG, Samsung and Google Nexus phones. But Apple has been a holdout, a factor which many have pointed to as crippling NFC acceptance. Retailers weren't interested in a technology which only a small set of phones had installed, and consumers weren't interested in a newfangled payment method which wasn't widely accepted and was harder than just swiping a plastic card.

The question now is not whether Apple has the juice to blow the dust off this technology. The real question is why now.

The Rise And Fall Of NFC

NFC, short for near field communication, is a variation of RFID, a technology used in keycards, transit passes, and other devices. While RFID devices varied in range and had security problems, NFC improved on older standards by requiring a physical tap and specifying better protection for data.

NFC can run without power, so a dead phone battery wouldn’t have you panhandling for lunch money. (Most systems require a live Internet connection for payments over a certain amount, though.) Its mix of hardware and software can even separate sensitive data like credit-card details from the rest of your phone in case the device is hacked. 

For years, that exciting premise fueled rumors that Apple would put NFC in the iPhone. That never materialized, even as competitors put it in some Android and Windows devices. 

Consumers and tech innovators gave NFC the cold shoulder—too complicated, too subject to carriers' controls. Square and PayPal bypassed it altogether, using other ways to securely identify customers. Google Wallet's NFC payment option has been all but dead on arrival, killed by a lack of support from carriers. An industry consortium, the unfortunately named Isis, has had similar problems. 

And retailers like Best Buy and 7-Eleven have reportedly ditched their initial support for NFC, switching to simpler mobile-payment technologies like barcode scanning—a solution Starbucks, Square, and PayPal have all embraced as compatible with existing cash registers and familiar to consumers and cashiers.

Too Late Or Just In Time?

Meanwhile, Apple has been quietly keeping an eye on NFC. The company has filed related patents, one of which was discovered as recently as last January.

After all this time, Apple CEO Tim Cook and his crew might be finally ready to pursue tap-to-pay as another iPhone payment option. And this timing makes some sense, when you consider what the company has been up to lately. It launched Passbook, the location-aware app that slaps coupons and other things on screen when you’re near a store; Bluetooth-based iBeacons, for in-store customer tracking; and Touch ID, a fingerprint scanner, in the iPhone 5S. 

Apple also loves to mention its hundreds of millions of credit cards on file in the iTunes store—though processing its own credit-card transactions at retail is a far different beast than processing transactions for other retailers, where all kinds of complications arise from fraud, disputes, and payment glitches.

Even so, several pieces in Apple’s mobile payments jigsaw puzzle are in place, ready to lock together in a system that knows when you’re approaching a store, can send a promotion to you based on the aisle you’re in, and authenticate you when it's time to purchase. All it needs now is a way to conduct the actual transaction.

With NFC, that would mean bumping an iPhone to an NFC-ready terminal to transmit payment details. 

Although Apple doesn't even use it in its own stores—people can pay directly for accessories and other gear from the Apple Store mobile app and walk out with the item, no waiting required—it offers a nice alternative for retailers not ready to ditch the time-honored transaction ritual. 

Striking While The Iron Is Hot

There is no doubt that people have become accustomed to shopping with their phones. 

But most of that so-called "m-commerce" is people engaging in conventional e-commerce, using phones or tablets instead of laptops. Players like PayPal cite big numbers for their mobile transactions, deliberately conflating in-store retail payments with e-commerce that's just shifting to different devices.

And it's far from clear that retail payments are broken. Most consumers seem perfectly happy swiping plastic cards. PayPal and Square have put an emphasis on ordering ahead with their mobile apps, a scenario where it makes sense to bypass the conventional card swipe.

Where phone-based mobile payments may matter most, though, is in markets that are far from home for Apple.

Apple cares deeply about making inroads in China. Smartphone users there are voracious mobile shoppers. Chinese tech site TechNode reported figures from the People’s Bank of China, revealing that mobile payments in the country amounted to nearly $1.6 trillion in transactions last year. 

Granted, this figure covers online purchases, in-store payments using mobiles, and person-to-person money transfers—so while it's an eye-popping figure, it doesn't really correlate with the potential market for NFC.

Other reports indicate NFC transactions account for a small slice of phone-based shopping in China. The reason: There just aren't many opportunities to pay with a phone tap. Apple could change that by galvanizing the market and persuading retailers to take a risk on NFC.

Ultimately NFC may matter for more than just payments. Apart from shopping, NFC could also deliver convenience features in smart homes, cars and health devices, which are all areas that Apple's already addressing. 

Apple has a knack for taking ho-hum technologies and turning them into hot, in-demand products. That’s sort of its specialty. The signs indicate it's finally ready to try its Midas touch on NFC. It won't be long until we know for sure. 

Lead photo by Incase

Categories: Technology

Our Growing Obsession With Screens Is Killing Our Brains

Thu, 2014-08-28 17:47

Paperback readers, rejoice! It turns out reading books on e-readers doesn’t give your brain the same experience as those paper books gathering dust on your bookshelf.

A new study by European researchers found that recollection of plot points and story lines was “significantly” worse for readers who read on a Kindle than for those who read a paperback book, the Guardian reports.

Fifty readers were given a mystery book to read; half were given a Kindle, and the other half, a paperback. The researchers, led by Anne Mangen from the University of Stavanger in Norway, discovered that readers comprehended the literature less well on a Kindle than on paper.

Mangen presented the work at a conference in Italy in July, the Guardian reported, and the study will be published as a paper.

“The Kindle readers performed significantly worse on the plot reconstruction measure, i.e., when they were asked to place 14 events in the correct order," Mangen told the newspaper.

Does Paper Or PDF Really Matter?

The Kindle-versus-paperback study used a very small sample and is thus prone to all sorts of uncertainties, but it falls in line with a growing body of evidence on reading comprehension of digital material.

Last year, Mangen analyzed the reading comprehension of 72 10th graders. She broke the students into two groups: One group received information on paper, and the other read PDFs on a computer screen. She found that students who read on computer screens comprehended less than their paper-reading peers, and forgot the material much more quickly.

It’s not just reading that could be suffering, but writing too. As handwriting and cursive give way to iPads and laptops, educational development in students who are just beginning to read and write creatively could be negatively affected.

Psychologists and researchers are studying why handwriting is so important, especially in child development. One study found that when a child drew a letter on paper, activity increased in the same three brain areas that are activated when adults read and write, the New York Times reports. That activity did not appear when the child typed the letter.

Our brain must understand that each possible iteration of, say, an “a” is the same, no matter how we see it written. Being able to decipher the messiness of each handwritten “a” may be more helpful in establishing that mental representation than seeing the same printed result repeatedly.

The importance of handwriting can follow students through college, too. When students take notes on laptops, they perform worse on conceptual questions than students who take notes longhand, according to a 2014 study.

Don't throw out your pulp fiction and notebooks just yet—they aren't going anywhere anytime soon. If people begin to rely solely on computer screens for reading and writing as our books are replaced by digital copies, comprehension may suffer.

Then again, forgetting a book might not be such a bad thing, considering you can read it, and enjoy it, all over again.

Lead image by Selena Larson for ReadWrite

Categories: Technology

Apple Confirms: It's Holding An Event On September 9

Thu, 2014-08-28 16:53

Apple confirmed that it will be holding a special event on September 9 in its hometown of Cupertino, Calif. 

The company's coy teaser, "Wish we could say more," leaves us wondering what to expect. If rumors hold true, we'll see new iPhones (which may include NFC mobile payments), iOS updates and even the long-awaited and hotly debated Apple wearable device. 

Categories: Technology

Ahead Of The iWatch, Apple Has Given HealthKit A Workout

Thu, 2014-08-28 13:02

ReadWriteBody is an ongoing series where ReadWrite covers networked fitness and the quantified self.

When Apple first unveiled HealthKit, its software for weaving together fitness apps, wearables, and iPhones, there were more breathless exclamations than you'd find in your average SoulCycle workout.

I was quick to deflate the hype around HealthKit, pointing out that the skimpy specs and documentation Apple released this summer showed that it was geared around fairly specific medical tasks, like recording a body-temperature reading from your armpit—not the needs of fitness-app makers.

Since then, Apple executives have been busy taking meetings with fitness-focused developers, and the most recent beta release of iOS 8, the next version of its iPhone software, suggests that HealthKit could actually be a helpful workout buddy.

Crucially, Apple introduced new data types and functions in HealthKit for logging workouts earlier this month that didn't exist in its first release. HealthKit apps can now log workout type, duration, and calories burned.
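The workout-logging additions described here map onto HealthKit's `HKWorkout` type. As a rough sketch (the `logRun` helper and its activity, distance, and calorie values are made up for illustration, and a real app must first request write authorization via `HKHealthStore.requestAuthorization`), saving a run might look like this:

```swift
import HealthKit

// Hypothetical helper: record a finished run as a HealthKit workout sample.
func logRun(store: HKHealthStore, start: Date, end: Date) {
    let workout = HKWorkout(
        activityType: .running,                  // workout type
        start: start,
        end: end,
        duration: end.timeIntervalSince(start),  // duration in seconds
        totalEnergyBurned: HKQuantity(unit: .kilocalorie(), doubleValue: 250),
        totalDistance: HKQuantity(unit: .meter(), doubleValue: 5000),
        metadata: nil)

    // Fails unless the user has granted write access to workout data.
    store.save(workout) { success, error in
        if let error = error {
            print("Could not save workout: \(error)")
        }
    }
}
```

HealthKit runs only on-device and behind user consent, so the store refuses the save until the user has explicitly shared workout data with the app.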

Even medically-oriented apps can capture more meaningful data now. Samir Damani, a cardiologist and CEO of MD Revolution, a fitness-app maker, pointed out to me that the first version of HealthKit didn't capture data like a user's body-fat percentage, which clinical studies have shown is a better predictor of fitness than weight or body-mass index. In its latest update, HealthKit now includes body-fat statistics as a metric.

In this regard, Apple is playing catch-up with Google, whose Google Fit software tools launched with far more developed workout-related features and a better Android-based structure for adding new data types.

The latest reports suggest that Apple will announce a wearable device, the so-called "iWatch," in September—earlier than previously expected. The only clue to its function in Apple's documentation is HealthKit's ability to record heart-rate data from a wrist-based device.

HealthKit also records sleep-quality information, distinguishing whether you're in bed or actually asleep—data that devices like the Jawbone Up, Fitbit, and Runtastic Orbit can collect.

The question, though, is what Apple will do with all this data, whether it comes off an iWatch or other wearables that interact with HealthKit.

"It's not in Apple's DNA to interpret data," says MD Revolution's Damani. That will be up to HealthKit-ready apps.

It's a healthy sign, though, that Apple is making such rapid progress in fleshing out HealthKit, and going beyond just vital signs that a nurse might collect in a doctor's office to the real signals of a healthy body that we produce when we work out.

Photo via Shutterstock

Categories: Technology

Samsung's New Gear S Smartwatch Looks Great

Thu, 2014-08-28 10:47

On the heels of this year's Gear 2, Gear Neo and Gear Live, Samsung announced yet another new smartwatch on Thursday. And this one's a looker. But is it a doer? That's another question entirely.

The Gear S features a curved, two-inch AMOLED screen that should offer beautiful visuals (and does, at least in Samsung's marketing photos). And because the screen isn't just another flat slab of glass, it lends the whole watch an uncharacteristically fashionable appearance—one similar to those wide cuff-style bracelets intended for both women and men.

In theory, the Gear S backs up its handsome appearance with both brains and brawn. Unlike most modern smartwatches, it can connect directly to the Internet; users can also make and receive phone calls in the absence of a smartphone. Like other smartwatches, it can also pair to a smartphone in order to run apps and receive notifications.

You can see why Samsung would want to go this route at this time. Apple could introduce its long-rumored iWatch (or whatever) as soon as September 9, so Samsung's move looks very much like an attempt to steal some thunder from its design-obsessed rival.

That move presents a big risk. Samsung has loaded this watch down with so much gadgetry, not to mention unfamiliar and potentially confusing software, that it risks turning off the very fashionistas it's apparently trying to court.

What The Gear S Does

In some ways, the Gear S isn’t all that different from its rivals in the wearable-technology market. 

Gear S in black

It can pair to Samsung's Galaxy line of smartphones in order to funnel in text and email notifications and extra features, courtesy of mobile apps designed to work with the device. It also offers fitness features, like heart-rate monitoring and step tracking. 

The Gear S is accompanied by an accessory called the Gear Circle, a nifty set of Bluetooth headphones with a magnetic clasp. Together, the pair of gadgets allows users to receive or place calls without ever whipping out their phones.

The Gear S also comes with a 3G radio and built-in GPS, as well as Bluetooth and Wi-Fi. But that wireless independence comes at a cost—in this case, a heavy drain on the watch's smallish 300 mAh battery. For comparison, that's less than one-eighth the size of the battery in the Galaxy S5. It's also a smaller battery than those in rival watches such as the upcoming LG G Watch R and the Moto 360, whose power cells top 400 mAh.

Now, maybe Samsung has some magnificent power-management technology up its sleeve. Otherwise, it's going to be very interesting to see what kind of battery life the Gear S can deliver.

What The Gear S Looks Like

At least those internals come housed in a good-looking case with a beautiful, curved display. 

The company likely took note of the positive feedback the curved screen on its Gear Fit fitness tracker received. The curve also let Samsung indulge its penchant for enormous screens; a giant flat slab on an arm would be problematic for numerous reasons, comfort being a big one. 

Gear S in white

The Gear S has a 2-inch curved "Super" AMOLED display, which should offer visuals at a generous size for something wrist-worn. It's a big screen even for Samsung; the company’s earlier Gear 2 shipped with a 1.83-inch screen, while this summer's Gear Live featured an even smaller 1.63-inch display.

Of course, that AMOLED display will also likely hit the battery hard. Samsung says that under normal use (whatever that is), the watch should deliver two days of battery life. We're a little skeptical—unless "normal" means shutting down everything that makes this watch unique. But we'll reserve judgment until we actually get our hands on it. 

What Makes The Gear S Run

As for software, Samsung seems to be ping-ponging between operating systems, hedging its bets between Android and its home-grown Tizen software. For the Gear S, it's gone with Tizen, software that analysts and Samsung rivals frequently disparage as "dead in the water" or as having "no chance to be successful."

Samsung would obviously like to pull away from Android and its dependence on Google. But its on-again, off-again affair with Tizen is doing consumers no favors. 

Samsung Gear Circle

It went with Android for the first Gear, threw it over for Tizen on its follow-up Gear 2 and Gear Neo, and then hopped on the Android Wear bandwagon in June with the Gear Live. Now it’s bouncing back to Tizen.

Samsung plans to present its new wearable at the IFA show in Berlin next week, where it will announce global availability starting at the beginning of October. 

Photos courtesy of Samsung.

Categories: Technology

Dropbox Antes Up In The Cloud-Storage Price War

Wed, 2014-08-27 18:13

Cloud-storage platform Dropbox announced today that it is offering a terabyte of storage for $9.99 a month, matching the terabyte price set by competitors Google Drive and Microsoft’s OneDrive.

This is just the latest move in the great cloud-computing war of 2014, as these three heavy hitters race to the bottom on the price of a consumer storage gigabyte. In March, Google announced a terabyte for $10 a month and 10 terabytes or more for $100 a month.

Apple, meanwhile, plans to release its iCloud Drive later this year, which will offer 5GB for free, 20GB for a dollar a month or 200GB for $4 a month.

Images courtesy of Dropbox

Categories: Technology

The Truth About DevOps: "IT Isn't Dead; It's Not Even Dying"

Wed, 2014-08-27 17:37

With the rise of the modern developer, traditional IT operations is dead, right? 

Not even close, says Puppet Labs founder Luke Kanies. And he should know. Puppet is one of the technologies at the heart of the DevOps phenomenon, a merger of traditional IT and app development in which developers take on more responsibility for running the infrastructure that hosts their apps—in part, by automating it.

See also: DevOps: The Future Of DIY IT?

Yet Kanies believes that while there are "plenty of environments that can survive without ops for a while," they almost certainly don't look like the enterprise you work for today or want to work for tomorrow. As Kanies points out, even developer-heavy organizations like Google still depend upon operations.

To get more insight into the shifting landscape of IT operations, I sat down with Kanies to learn more about how operations is changing, and where it's going.

Long Live Operations

Luke Kanies, founder of Puppet Labs

ReadWrite: Developers are the new kingmakers. What does this mean for operations? Are they doomed?

Luke Kanies: I don't buy it. Developers have been the kingmakers since the 90s, and yeah, things are different, but to say that operations won't have a significant role in the future just doesn't make sense. 

The reason why operations is feeling pressure now is that they have a real competitor for the first time—if IT says no enough times, the developers just work around them, which is how you get shadow IT. That competitive pressure won't result in IT going away, though, it'll result in IT stepping up.

Or at least, that's what will happen in most cases, eventually. 

Sure, there are plenty of environments that can survive without ops for a while, like all-developer startups with no users and no production, and there are environments where IT just can't get it together, like very outdated enterprises where the only way to drive change is to essentially give up. In most cases, though, the majority of IT will shift, change, and evolve.

You're already seeing outsourcing move from the technology layer to the service layer, which is working really well, but that also doesn't make IT go away. Salesforce and Dropbox both still have sysadmins; I know, because they both use Puppet. However, your using them means your own IT load might get a bit smaller.

Specialize, Specialize, Specialize

RW: You seem to think there's a very long shelf life left for Ops. Why? What is essential about the role Ops plays?

LK: You can move operations around—outsource to Savvis, outsource at the service layer to a SaaS provider, abstract via a PaaS—but fundamentally, there are different skills involved in making infrastructure work and in running applications than there are in building them. It's fundamentally about specialization, and specialization gets deeper over time, not shallower.

RW: How is Ops changing? What tools do they need to know?

LK: It used to be that a sysadmin had to know every piece of technology, but somewhat shallowly. Now, with SaaS, and the (slow) transition from vertical network/storage/system/etc silos into horizontal abstractions across the whole datacenter, we're finding that sysadmins can go deep on a few topics.

I think the best analogy is from the industrial revolution: Going into it, every farmer made nails, clothes, thread, furniture, houses, and everything else. As excess capacity was created through productivity, people began to specialize and sell the results of that excess capacity. 

So, if you were particularly good at making nails, you could focus on that, and sell the extra. The reason there was someone to sell the extra nails to is that someone else was good at making thread, or cloth, and they could sell that.

We're seeing something similar resulting from SaaS. Modern startups get to focus almost exclusively on a couple of applications, and they outsource everything else to service providers. 

At Puppet Labs, basically all of our infrastructure is focused on our customers; we have SaaS providers for email, file-serving, trouble tickets, and essentially everything else. This allows us to specialize, so our sysadmins (and developers) go deep, and it allows our providers to similarly specialize.

Thus, it's a long transition from an inch deep and a mile wide to a mile deep and an inch wide.

Technology As A Core Competency—For Everyone

RW: Look 5 years out. What does development look like then? How about Ops?

LK: I'm not a huge fan of prognostication, because I hate being wrong, and I'm not convinced you can sufficiently reliably predict to avoid the egg on the face. 

That being said, it's pretty easy to believe that in the future we'll have more servers providing more services that evolve more quickly, and they'll be more important to us. Mostly that just means that technology will have to be a core competency of essentially everyone, and none of the stuff we're doing now is anywhere near done. 

If you think you're missing the cool stuff, don't worry. Everything that comes after is even more interesting than what's happening right now.

RW: With that in mind, if you were giving advice to someone graduating from college and interested in enterprise IT, what would you tell them? How should they build a career in IT?

LK: Wow. I wouldn't really give fair answers to people, because I didn't travel a path that others would. [Ed. note: He's not kidding. Kanies grew up on a hippie commune in Tennessee.] I have a degree in Chemistry from a liberal arts college [Reed College], and I got fired a *lot* in my path to becoming a sysadmin.

Given that, though, my advice is to become uniquely valuable. 

That is, become great at something that the market finds useful, and spend the majority of your time finding a way to be more valuable than the people around you. Don't travel the well-trodden path, but also, don't spend too much time bouncing around. Practice, learn, spend hours and hours improving.

You need to spend the next 5 years getting what amounts to a PhD in whatever you're working on. What books do you have to read, what jobs do you have to take, what risks do you have to run, to learn what you'll need to base your career on? Answer those questions, and you'll be OK.

RW: How do companies use Puppet? How is this different from the old world of Ops?

LK: What's different is that it's faster, and more fun. The biggest pressure our customers feel is to move more quickly, both in making decisions, and in carrying out the resulting work. Puppet helps them stop firefighting and doing trivial repetitive work, and spend more time doing the interesting strategic work that involves getting more of your software out to your customers more quickly.

This makes the sysadmins happier and more fulfilled, because they like their jobs better and they're more relevant to their companies and customers, and it ties the work of IT to the strategic priorities of the company, which means the rest of the company is much happier and more aligned.

Lead image courtesy of Puppet Labs

Categories: Technology

Why We Self-Censor On Social Media

Wed, 2014-08-27 17:05

Have you ever caught yourself deleting a Facebook status update even before you post it, because you anticipate the negative backlash it will receive from your online friends?

I know I have, and according to a study by Pew Research, you probably have, too.

Twitter and Facebook can be useful tools for distributing news and information, but when it comes to controversial news, Americans can be reluctant to share things on social media if they know friends might disagree.

To study the effect social media has on debate and discourse of polarizing public issues, researchers surveyed 1,801 American adults about the Edward Snowden NSA privacy revelations in 2013, which exposed the government’s massive data collection through surveillance of Americans’ phone and email records.

Just 42% of Facebook and Twitter users were willing to post about the NSA surveillance program, compared to 86% of people who were willing to have a conversation in-person. The study also found that people were more willing to talk about the topic with people who agreed with them, both online and off. 

The study also found that social media users are more hesitant than others to share their dissenting views, online or off. 

The average Facebook user (someone who uses the site a few times per day) was half as likely as other people to say they would be willing to voice their opinion with friends at a restaurant. If they felt that their online Facebook network agreed with their views on this issue, their willingness to speak out in a face-to-face discussion with friends was higher.

According to Pew, a “spiral of silence,” a term that was introduced long before social media, can be applied to both online and offline opinions and debate. This means that generally one dominating viewpoint overtakes the conversation because those with minority viewpoints are less likely to speak up.

American politics are more divided than at any time in recent history, and that could be the result of people continuing to surround themselves with, and have discussions with, people who are sympathetic to their opinions. With social media, it is easier to self-censor and share updates you know will get a lot of Likes, rather than opinions that could spark a negative debate in the comments. 

The old adage, "Don't talk politics or religion at the dinner table," apparently applies to social media, too.

Lead image by bryan 

Categories: Technology