Leaks point to two 2016 Moto X phones with modular add-ons

Motorola was in dire straits when it was acquired by Google back in 2012, as its phones failed to compete with the Galaxies and iPhones of the world. Google oversaw the launch of the original Moto X, which many feel was a turning point for the company. Now Motorola is owned by Lenovo, and it’s nearly time for a refresh of the Moto X. Multiple leaks point to a radical redesign focused on a system of modular accessories, because apparently modular phones are the gimmick of 2016.

In the first of the recent Moto X leaks, we saw a device that looked very different from the last few Motorola flagships. The back panel was completely flat and made of metal, rather than the curved plastic, wood, and leather of current phones. The trademark Motorola dimple was also gone, much to my personal dismay. That small depression gives your index finger a natural resting place and helps stabilize the phone in your hand, which is a big help with hand-stretching phablets.

On the bottom of this rear panel we see 16 electrical contacts, and sources now say these are for connecting modular accessories called Amps. If that sounds familiar, it’s because Motorola isn’t the only OEM to have this idea in 2016. LG did something similar with the G5: the bottom of that phone comes completely off and can be replaced with a camera grip or a Hi-Fi audio module. However, you need to shut the phone off, swap the battery, and wait for it to boot back up to change G5 modules. Those two modules aren’t even very compelling (and the Hi-Fi isn’t available in the US). Motorola’s system has the potential to be much more friendly.


According to the leaks, there will be six different attachments for the new Moto X: a simple stylized cover that comes with the phone, a dedicated camera module with optical zoom, a pico projector, a battery pack, stereo speakers, and a wide-angle camera lens with a rugged case. You’ll be able to simply snap these onto the back of the phone, where they’ll cover it from top to bottom, without a reboot. They’ll probably be held in place with magnets.

Pricing will be key. LG wants $70 for the camera grip and nearly $200 for the Hi-Fi. I have a hard time believing people are going to buy very many $100-200 accessories for their phone.

We also have it on good authority now that there will be two Moto X phones this year, both compatible with the same modules. The Moto X Vector Thin (above right) will be the flagship with a 5.5-inch 1440p AMOLED, Snapdragon 820, and 4GB of RAM. The Moto X Vertex (above left) will have a 5.5-inch 1080p AMOLED, Snapdragon 625, and 3GB of RAM. Oddly, the Vector Thin will be so thin (5.5mm) that it’ll only have a 2600mAh battery (the current Moto X is 3000mAh). The Vertex will be a little thicker (7mm) with a 3500mAh battery. Both phones have a fingerprint sensor on the front below the screen as well.

But wait, there’s that modular capability. Even though the Moto X Vector Thin allegedly has a pretty small battery, you can just buy that battery Amp and attach it. Oh, and Motorola is ditching the stereo speakers on these phones, but you can get the stereo speaker Amp. Frankly, this is starting to sound like DLC for your smartphone. It strikes me as very strange that Motorola/Lenovo might be artificially limiting a phone in order to sell you modules.

Lenovo is set to host its Tech World event on June 9th where it will show off its new Project Tango phone, as well as something from Motorola that will “transform mobile in a snap.” That sounds like a modular phone announcement.

Doubts cast over whether teenager located lost Maya city

Fundamental breakthroughs in archaeology are comparatively rare events. For every King Tut’s tomb or discovery of Troy, there are thousands of hours of painstaking labor, whether that means tromping through jungles, excavating fragments of pottery, or poring over ancient manuscripts preserved by volcanic eruptions. The Internet is currently erupting with stories of William Gadoury, a 15-year-old Canadian who may have located an ancient Maya city by mapping Maya constellations against other, known Maya cities.

The initial story is that Gadoury managed to map 117 cities to major stars in 22 constellations. When he included a 23rd constellation, he found that two of its stars matched already mapped locations, but a third star was unmatched. He predicted the potential location based on the other two, asked for existing satellite photography of the area, and found something that looked obviously man-made.
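
The geometry being described is straightforward to illustrate. Below is a minimal, purely illustrative Python sketch of that kind of extrapolation: assume two stars of the constellation are already matched to known city coordinates, fit a similarity transform (scale, rotation, and translation) from the star chart to the map, and apply it to the unmatched third star. The coordinates are invented and this is not Gadoury’s actual procedure, just the general idea.

```python
def predict_third_site(stars, cities, target_star):
    """Fit a similarity transform (scale + rotation + translation) from
    star-chart coordinates to map coordinates using two matched pairs,
    then project an unmatched star onto the map."""
    # Treat 2D points as complex numbers so the transform is z -> a*z + b.
    s = [complex(*p) for p in stars]
    c = [complex(*p) for p in cities]
    a = (c[1] - c[0]) / (s[1] - s[0])   # encodes scale and rotation
    b = c[0] - a * s[0]                 # translation
    t = a * complex(*target_star) + b
    return (t.real, t.imag)

# Hypothetical star-chart positions and (longitude, latitude) of two matched cities.
stars = [(0.0, 0.0), (1.0, 0.0)]
cities = [(-90.0, 17.0), (-89.0, 17.5)]
print(predict_third_site(stars, cities, target_star=(0.5, 0.8)))
# -> roughly (-89.9, 18.05): the predicted spot to check in satellite imagery
```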


The potential site, named “K’aak Chi” by Gadoury.

The find has been reported as an 86-meter pyramid with up to 30 structures around it. Actual experts, however, are more than a little skeptical of claims that Gadoury located a major Maya city, instead identifying the feature as a cornfield, or milpa.

Maya scholar David Stuart’s response, dismissing the find as the product of a “Rorschach” process rather than a lost city.

Stuart notes in separate comments that he’s glad to see Gadoury taking a commendable interest in Mesoamerican archaeology, but that some of the experts jumping on the bandwagon to drive attention to their own work really ought to have known better. When he calls this a Rorschach process, he’s referring to Rorschach inkblots, which don’t actually represent anything but can tell you something about the mindset of the person who looks at them.

There are potential issues with the claim that cities can be mapped cleanly onto constellation charts. What we call the Maya civilization existed for thousands of years. Mesoamerica is one of the six identified “cradles” of civilization; identifiably Maya culture stretches from roughly 2000 BC to AD 1697, when the last Maya city-state fell to the Spanish. Just as the Roman Empire expanded, contracted, and ultimately transformed into the Byzantine Empire before falling to the Ottomans in 1453, the Maya civilization went through profound transformations across thousands of years. Many sites were abandoned after the 1st century AD, and new cities and complex relationships arose to replace them.

Human beings have built large-scale structures to mark cosmological events and observe the heavens for millennia, and the Maya were no exception. Past analysis has shown that Maya charts and codices on planetary movements were, in fact, far more advanced than anything Europeans knew at the time of first contact. There’s no evidence, however, that the Maya built entire cities based on their astronomical observations, and plenty of reason to think they didn’t. Humans build monuments and temples to mark their observations of the stars; we build cities in areas where there are useful resources or opportunities for trade. Creating an entire network of cities across hundreds of miles of challenging geography would have been a monumental endeavor, particularly since the indigenous cultures of Mesoamerica had no domesticated equivalent of the horse or ox.

Stories like this grab the human imagination precisely because every now and then, we do find an ancient city or vanished civilization — particularly in Mesoamerica and South America, where the jungle can hide the telltale signs of human habitation with amazing speed. Sites like Vilcabamba and Machu Picchu were both lost for centuries before being rediscovered. Most of the time, however, archaeologists find just another cornfield — and until a team of on-site archaeologists can verify the existence of a massive pyramid and 30 stone structures, we’re going to side with the experts on this one.

Overwatch beta runs well on consoles and PC alike

Blizzard has a long history of producing high-quality multiplayer games, and Overwatch isn’t going to buck that trend. Last week’s open beta gave us a representative look at what the retail version of the game will be like when it launches on the 24th, and it seems to work as intended regardless of which platform you plan on using.

For the better part of a decade, Blizzard worked on an MMO dubbed “Titan.” It was eventually canceled in 2014, but some assets from that project live on in Blizzard’s new first-person shooter. And it’s not just Blizzard’s legacy that Overwatch is building on. In many ways, it also takes inspiration from Valve’s Team Fortress 2: it’s a team-based shooter built around wildly different classes, and the lively personalities of the characters evoke very warm feelings for those of us with hundreds of hours of TF2 under our belts.

From a technical perspective, Overwatch seems to be well-built. Both the Xbox One and PS4 versions feature dynamic resolution scaling, but both sit at 1080p most of the time. Based on the comparison done by Digital Foundry, the Xbox One appears to drop below 1080p more frequently, but it’s not a widespread issue. You’ll see a slightly fuzzier image for short stretches, but that’s certainly preferable to being stuck at 720p or 900p at all times.

The game targets 60fps on consoles, and it usually delivers. The PS4 version rarely budges from 60, and when it does, it seems to happen most often when you’re watching the kill cam. Over on the Xbox One, drops are a bit more frequent when you’re actually playing. Unfortunately, you’re going to get some screen-tearing, but it’s not enough to ruin the experience.

If you have the option to play Overwatch on the PC, that’s going to be your best bet. It comes as no surprise that you’ll be able to outpace both consoles in the frame rate and resolution department. On top of that, the draw-distance, lighting, and anisotropic filtering are all improved on the PC side as well. The differences are still pretty minor though. If you only have access to a gaming console, you’re not missing out on all that much here.

If you didn’t get the opportunity to play the beta yourself, you can get a good look at Overwatch in action in the video above. Our sister site IGN played through about an hour of the open beta live on the internet, and two full hours during the closed beta.

I spent some time with the PS4 version over the weekend, and didn’t see any notable issues with the frame rate or network performance. As for the gameplay itself, I had zero trouble jumping in headfirst. If you’ve ever played an FPS before, you’ll pick up the basics quickly. Even so, there are 21 characters spread across four classes, and they all play differently. It’ll take some time to become familiar with the quirks of each “hero.”

While the developer has said that this most recent beta is exactly what we’ll see at launch feature-wise, it’s important to remember that all online games are susceptible to network issues. Even though Blizzard has an absurd amount of experience keeping servers up and running under heavy load, it’s still possible that the launch could be plagued with downtime. If that’s a concern for you, consider waiting until after launch to buy in.

New Windows 10 build kills controversial password-sharing Wi-Fi Sense

When Microsoft announced Windows 10, it added a feature called Wi-Fi Sense that had previously debuted on the Windows Phone operating system. Wi-Fi Sense was a password-sharing option that allowed you to share Wi-Fi passwords with your friends and contacts in Skype, Outlook, and Facebook. Here’s how Microsoft described the feature last year:

“When you share Wi-Fi network access with Facebook friends, Outlook.com contacts, or Skype contacts, they’ll be connected to the password-protected Wi-Fi networks that you choose to share and get Internet access when they’re in range of the networks (if they use Wi-Fi Sense). Likewise, you’ll be connected to Wi-Fi networks that they share for Internet access too. Remember, you don’t get to see Wi-Fi network passwords, and you both get Internet access only. They won’t have access to other computers, devices, or files stored on your home network, and you won’t have access to these things on their network.”


There were security concerns related to Windows 10’s management of passwords and whether or not those passwords could be intercepted on the fly. To our knowledge, no security breaches or problems were ever associated with Wi-Fi Sense. According to Microsoft, few people actually used the feature and some were actively turning it off, so the company is now removing it entirely. “The cost of updating the code to keep this feature working combined with low usage and low demand made this not worth further investment,” said Gabe Aul, Microsoft’s Windows Insider czar.

These changes are incorporated into the latest build of Windows, Windows 10 Insider Preview 14342. Other changes in this build include:

  • Microsoft Edge extensions are now downloaded from the Windows Store (Adblock and Adblock Plus are now available for download);
  • Swipe gestures are now supported in Microsoft Edge;
  • Bash on Ubuntu on Windows now supports symlinks (symbolic links);
  • Certain websites can now be directed to open in apps instead, ensuring that one of the mobile Internet’s worst features will be available in Windows 10.

Microsoft has also fixed playback errors with DRM-protected content from Groove Music, Microsoft Movies & TV, Netflix, Amazon Instant Video, and Hulu. The company fixed audio crashes for users who play audio to a receiver using S/PDIF or HDMI while using Dolby Digital Live or DTS Connect, and fixed some bugs that prevented common keyboard commands like Ctrl-C, Ctrl-V, or Alt-Space from working in Windows 10 apps. Full details on the changes and improvements to the build can be found here.

One final note: Earlier this year, we theorized that Microsoft might extend the free upgrade period beyond the July 29 cutoff, especially if it was serious about hitting its 1 billion user target. The company has since indicated that it has no plans to continue offering Windows 10 for free after July 29. If you want to upgrade to Windows 10, or are still on the fence about whether to accept Microsoft’s offer, you only have a little over two months to make the decision.

Kepler team announces largest-ever collection of 1,284 verified new exoplanets

NASA announced that its Kepler telescope has verified 1,284 new exoplanets in our galaxy, more than doubling the number of verified planets the telescope has found. An additional 1,327 candidates have better-than-even odds of being actual planets, but they aren’t well enough substantiated to be called “verified,” and will require additional study. What’s more, Kepler reports nine confirmed exoplanets of Earth-like size in their stars’ Goldilocks zones.

At the core of Kepler’s ability to say it has found planets lie its methods of prediction and verification. Kepler finds planets by watching for stellar dimming: when a planet passes between its star and an observer, the star gets a little dimmer for a little while. This works fine as long as you can find planets that pass directly between their parent stars and our telescopes, but it has failed in the past, because planets aren’t the only things that can dim a star’s light. Binary stars or smaller, dimmer objects called brown dwarfs can also make for a point of light that fluctuates regularly in brightness. These impostors have led to some astronomical disappointments.
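
The size of that dip is what tells you about the planet. For a central transit, the fractional drop in brightness is roughly the ratio of the planet’s disc area to the star’s, which is why Jupiter-sized impostors and genuine gas giants are so much easier to spot than Earth-sized worlds. A quick back-of-the-envelope calculation in Python (the radii are rounded textbook values, not Kepler measurements):

```python
# Fractional dimming during a central transit ~= (R_planet / R_star)^2.
R_SUN_KM = 696_000
R_EARTH_KM = 6_371
R_JUPITER_KM = 69_911

def transit_depth(planet_radius_km, star_radius_km=R_SUN_KM):
    """Approximate fractional drop in stellar brightness during a transit."""
    return (planet_radius_km / star_radius_km) ** 2

print(f"Earth-size planet, Sun-like star:   {transit_depth(R_EARTH_KM):.4%}")
print(f"Jupiter-size planet, Sun-like star: {transit_depth(R_JUPITER_KM):.4%}")
# Roughly 0.008% vs. 1% -- about a hundredfold difference in signal strength.
```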

But NASA announced at the press conference that a new method of analysis, developed by Tim Morton, can weed out dud planets from the candidate list en masse, without needing case-by-case confirmation from ground-based telescopes. Morton’s method accomplishes this by running Kepler’s list of candidates through a statistical filter based on how common impostors are in our galaxy. Then the new method compares the star’s brightness and dimming to an ideal model of a star and planet with an “edge-on” orbit. This adds up to unprecedented accuracy in predicting whether a given star is really being eclipsed by a planet.
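
Morton’s published pipeline does detailed astrophysical modeling of each impostor scenario, but the decision rule at the end can be sketched simply: every candidate gets a false-positive probability (FPP), and anything below a strict cutoff (the figure cited for this release was about 1%) is treated as statistically validated. The candidate IDs and FPP values in the sketch below are invented for illustration only.

```python
# Sketch of probabilistic validation: keep candidates whose false-positive
# probability (the chance the dimming comes from an eclipsing binary,
# background star, etc. rather than a planet) falls below a strict threshold.
FPP_CUTOFF = 0.01  # ~99% confidence that the signal is a genuine planet

candidates = [                      # entirely made-up example values
    {"id": "KOI-0001.01", "fpp": 0.002},
    {"id": "KOI-0002.01", "fpp": 0.300},
    {"id": "KOI-0003.01", "fpp": 0.008},
]

validated = [c["id"] for c in candidates if c["fpp"] < FPP_CUTOFF]
unresolved = [c["id"] for c in candidates if c["fpp"] >= FPP_CUTOFF]

print("statistically validated:", validated)
print("still just candidates:  ", unresolved)
```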

In keeping with their theme of “orange is the new blue,” NASA showed the new Kepler findings in orange in these images from the briefing, and older Kepler discoveries or observations from other telescopes are shown in shades of blue.

The histogram shows the number of planet discoveries by year for more than the past two decades of the exoplanet search. The blue bars show previous non-Kepler planet discoveries, the light blue bars show previous Kepler planet discoveries, and the orange bars show the 1,284 newly validated planets. Image and caption by NASA.

Kepler was conceived from the beginning as a mission for finding Earth-like planets elsewhere in the galaxy, and its original mission was a success, returning conclusive evidence for almost a thousand exoplanets within the Milky Way galaxy. But no plan survives contact with the enemy, and in this case, the enemy is the cold, hard vacuum of space. Kepler has been plagued by problems in its reaction wheels, leaving it with a mechanical Charley horse that prevents it from looking smoothly around in the sky. And apparently even NASA sometimes has to ask “Have you turned it off and back on again?” But Kepler just won’t quit. So the ground team decided, collectively, to do it live. They’re applying these new methods of analysis and confirmation to wring as much science as they can out of the telescope before its fuel runs out, sometime about two years hence.

When Kepler finally does run out of fuel, it’ll pass the baton to TESS and the James Webb Space Telescope, which will watch the skies in the visible and IR bands as we continue to seek new worlds elsewhere in space.

Reporting multiple Twitter abuses is now less tedious

Twitter is stepping up how it deals with rule-breaking accounts by finally allowing users to flag errant tweets en masse.

Starting today, anyone can include multiple offending tweets in a single report, which the social media service says will greatly expedite the review process when it investigates abuse on the site.

Reporting individual tweets – the standard operating procedure for taking down harmful accounts until now – yielded mixed results due to how little context a single flagged missive could give on the situation.

By allowing multiple tweets in a report, users can provide better background as to how a user is violating the site’s terms of service or threatening the safety of another account. It also ensures Twitter’s administrators aren’t bombarded with separate alerts over a single problem account.

Tackling Twitter’s problems

Offenses punishable by suspension are listed in full on Twitter’s policies page, and they include using the service for unlawful conduct, spamming, and abusive behaviors such as harassment or doxxing. Upon investigation, Twitter can then lock the account temporarily or shut it down completely.

Twitter’s updated reporting system is rolling out today for the service’s iOS, Android and web versions, with plans for users worldwide to have access to the feature in the near future – much to the relief of many prolific tweeters.

Online harassment has been a nagging issue for the microblog for a while, a wound picked at by controversies like Gamergate and other instances of toxic behavior thriving on the site. These led Twitter to develop its own super-squad to combat trolls earlier this year.

Not only does Twitter’s updated reporting system mean less of a headache for victims of abuse, it’s also a solid step forward for the site, which is now confronting issues head-on rather than letting them fester and drive users to safer-feeling waters.

Facebook’s latest report shows user data requests are on the rise

Facebook released more than its gloat-worthy earnings report this week. The social network also put out its most recent transparency report, detailing the many occurrences when governments worldwide requested it hand over data on its users.

Facebook’s newest Global Government Request Report breaks down government petitions filed with the site over the latter half of 2015. The report also details how many accounts were included across all requests, as well as how many of those requests were acted upon.

According to the report, government requests rose 13% from the first half of the year to the second, going up from 41,214 to 46,763. Items restricted due to violating local law also rose from 20,568 to 55,827 by the end of the year.

And for the first time, Facebook was able to relay exactly what type of requests were made by the US government, thanks to changes in the country’s legal system.

For example, of the 19,235 requests made by the US government for information on over 30,000 accounts – 81% of which Facebook complied with – more than half came with search warrants.

Also, 60% of US government requests received in the past year packed a non-disclosure order, meaning that users couldn’t be notified if Facebook carried out the request.

Takedowns

The company also detailed how many posts were taken down by authorities in different countries.

One example included blocking 366 pieces of content in Germany that didn’t comply with the country’s ban on Volksverhetzung (incitement of hatred) or Holocaust denial.

Facebook also said that in France, 32,100 of the 37,695 take down cases were for a single photo related to the November 2015 Paris terror attacks, citing French laws “related to protecting human dignity.”

The social network reassured the public that it doesn’t give governments “backdoor access” to its site, meaning that any information supplied by Facebook to authorities is handled exclusively in-house. (That’s not to say the company doesn’t have any backdoors – one was recently discovered by a white hat hacker who, thankfully, was working with Facebook to patch up vulnerabilities.)

Facebook’s stances on transparency and user security go hand in hand. The company says that it “scrutinizes each request for user data… for legal sufficiency” and supports privacy initiatives like the United States’ Email Privacy Act, which would require search warrants before law enforcement can ask tech companies to hand over someone’s emails.

Facebook has been providing data on government requests since 2013, and reports going back to the January–June 2013 period are available to view.

Don’t retweet something if you want to remember it

Uh-oh. Psychologists have found that sharing information with your friends makes it much harder to remember what that information was.

In experiments at Beijing University, a bunch of students were split into two groups and presented with a series of messages from Weibo, China’s Twitter equivalent. After reading each one, one group was given the option to share it or go on to the next message, while the other was only allowed to go on to the next message.

After going through all of the messages, the students were tested on their content. Those in the group with the option to share the messages gave almost twice as many wrong answers and demonstrated poorer comprehension.

“For things that they reposted, they remembered especially worse,” said Qi Wang, who co-authored a paper on the results in the journal Computers in Human Behavior.

Cognitive Overload

Why? Well, the researchers reckon that the students who had the choice to share or not share got distracted by that choice, leading to “cognitive overload”. Merely making the decision consumed cognitive resources that then couldn’t be spent on understanding the message.

To test that theory, the researchers ran a second experiment – getting the students to read a batch of Weibo messages (with the same share option offered to one group) and then read an unrelated magazine article. Again, those with the option to share performed worse when quizzed on the article’s contents, and when asked to rate the cognitive demands of the message-viewing task, they reported a higher cognitive drain.

“The sharing leads to cognitive overload, and that interferes with the subsequent task,” Wang said. “In real life when students are surfing online and exchanging information and right after that they go to take a test, they may perform worse.”

Wang added that web designers should take these results into account, creating interfaces designed to promote rather than interfere with cognitive processing. “Online design should be simple and task-relevant,” she said.

Now go share this story with your friends. Unless you want to remember it, of course.

Using these words will make your eBay items sell for more, apparently

Selling something on eBay? You might want to give a bit more consideration to the language you’re using.

Researchers at Birmingham City University examined over 68,000 items being sold on the UK eBay site and discovered that the words you use can affect how much money you make.

For example, fragrances that are described as “genuine” sell for an average of £21 while “authentic” ones fetch £34. A watch described as having “resistance” can apparently expect to sell for 50% more than a “resistant” watch, despite the difference in wording being very slight.

The study examined more than 15 million words, which were analysed using WebCorp software. Product listings were downloaded from the site over a 70-day period before being put through the WebCorp system.
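
To give a sense of the kind of comparison involved, here’s a toy Python version of the keyword-versus-price analysis: group listings by whether their description contains a given word, then compare average sale prices. The listings and prices below are invented; the actual study ran the WebCorp tools over more than 68,000 real eBay listings.

```python
from statistics import mean

# Invented example listings: (description, final sale price in GBP).
listings = [
    ("genuine designer fragrance, boxed", 21.0),
    ("authentic designer fragrance, boxed", 34.0),
    ("authentic fragrance gift set", 30.0),
    ("genuine fragrance, slight box damage", 18.0),
]

def average_price_with_word(word, listings):
    """Mean sale price of listings whose description contains the given word."""
    prices = [price for text, price in listings if word in text.lower()]
    return mean(prices) if prices else None

for word in ("genuine", "authentic"):
    print(f"{word!r}: average sale price £{average_price_with_word(word, listings):.2f}")
```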

Some of the findings are more understandable than others. The researchers note that “on-ear” headphones sell for nearly three times as much as “in-ear”, which doesn’t seem surprising at all to us considering these are two different types of product (and on-ears do tend to be more expensive).

But other discoveries are definitely useful to know about if you’re an eBay seller. Using the word “seats” when selling a car tends to be more lucrative than “seat” or “seater”, while DVD sellers are better off using the word “seasons” over “series”.

Low value terms

Obviously, words like “insufficient” and “defects” fall under the ‘low value terms’ category, while the trend of words like “Apple” and “Retina” selling better probably has more to do with the value of the products themselves. Grammatical errors are also a big no-no.

The flip side of this is that the researchers could pick out trends among different types of sellers. They found antique sellers were most likely to write more personally than sellers in other categories, using words like “I”, “me” and “my”.

Meanwhile car sellers tend to use the term “second-hand” less than others, opting for words like “reliable”, “honest”, “clean” and “reluctant” instead.

Andrew Kehoe, one of the project researchers, said: “The term ‘second-hand’ seems to have a stigma attached when it comes to cars, but people will happily use it to sell smaller items like books or DVDs. We’ve found that the language used in eBay descriptions really does have an impact on whether items sell and for how much.”

Other browsers get the push as Microsoft makes Cortana Edge-only

One of the major moves Windows 10 made was to bring Cortana to the desktop – whether you wanted her there, or not – and now Microsoft has announced a big change to the way search works with its digital assistant on PC.

From now on, when you use Cortana’s search box (bottom-left, on the taskbar), Microsoft’s Edge (which is also exclusive to Windows 10) will be the sole browser the OS launches, with third-party browsers locked out. Bing will also be used as the search engine, again as the only option.

Previously, it was possible to have a Cortana search use Chrome, or Google’s search engine, using simple workarounds (i.e. installing an extension). Microsoft has now shut these avenues down so Cortana is exclusively tied to Edge and Bing.

(Note that this change does not affect browser usage within the OS – it simply pertains to searches which are initiated via Cortana, so if you don’t use the digital assistant, or the aforementioned workarounds, you won’t notice anything different).

Searching questions

Naturally, this has prompted some unhappy noises from users who don’t appreciate having their options narrowed and being dictated to by Redmond, but the company argues it has made the change for good reason.

In a blog post, Microsoft noted that redirecting Cortana to use other browsers or engines results in a less reliable search experience, because the digital assistant has been specifically designed to work with Edge and Bing (indeed, Cortana is powered by Bing as Redmond frequently reminds us).

Microsoft provides a number of examples of what this close integration brings: search for ‘Bluetooth not working’ via the Cortana box, for instance, and Bing will produce a ‘rich video help answer’ that’s only available on Windows 10 with Edge.

Or if you search for ‘Pizza Hut’, when you click through to the pizza chain’s website in Edge, Cortana will automatically show you the closest restaurants. Obviously that wouldn’t happen with, say, Firefox, and so you can see where Microsoft is coming from here.

Microsoft stated: “The continuity of these types of task completion scenarios is disrupted if Cortana can’t depend on Bing as the search provider and Microsoft Edge as the browser. The only way we can confidently deliver this personalized, end-to-end search experience is through the integration of Cortana, Microsoft Edge and Bing – all designed to do more for you.”