Wednesday, July 30, 2014

iFixit teardown reveals Amazon Fire Phone not designed for easy DIY repairs

Amazon's new Fire Phone is not DIY-friendly. Or so say the folks at iFixit, who published a teardown of the new smartphone yesterday, giving it a repairability score of 3 out of 10, with 10 being the easiest to fix.

The gadget repair experts warned that the phone is not modular, with components often sharing cables. The four extra cameras on the front, which track a user's head movements to enable special screen effects, complicate things further, as they are hard to replace individually.

iFixit noted that the Fire Phone's construction is similar to the iPhone 5's, down to the screws along the bottom edge. There is also no carrier logo, suggesting that more carriers besides AT&T will eventually sell the device.

The teardown reveals that the phone contains chips from Qualcomm, NXP, and Samsung Electronics. The device's radio frequency, power amplifier, audio, and Wi-Fi chips come from Qualcomm, while the 32GB of NAND flash used for photo, music, and media storage and the 2GB of DRAM come from Samsung.

NXP contributed a near field communication chip, enabling features such as mobile payments. The smartphone also includes a touch screen controller from Synaptics, and a communications chip from Skyworks.

Priced at $649 contract-free or $199.99 with a contract with AT&T, the 4.7-inch Fire Phone sports a Qualcomm 2.2GHz quad-core Snapdragon 800 CPU, Adreno 330 GPU, 2GB RAM, a 13-megapixel rear-facing camera, and a 2.1-megapixel front-facing camera. The device hit stores this week and has received lukewarm reviews.


Jimmy Kimmel tells people this $20 Casio watch is Apple's new iWatch


Amazon reports steep loss after launching bevy of new products, services

Amazon on Thursday reported worse-than-anticipated second quarter results. The online retail giant posted a loss of $126 million, or 27 cents per share - far greater than the 15-cents-per-share loss that analysts had forecast.

In the year-ago period, Amazon posted a loss of just $7 million, or 2 cents per share. Revenue, however, rose 23 percent to $19.34 billion, up from $15.7 billion in the same period of 2013 and essentially in line with expectations of $19.33 billion.
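As a quick sanity check on the reported figures, the per-share loss is simply the net loss divided by the diluted share count. A minimal Python sketch using the numbers above (the share count here is implied from the reported figures, not stated in the report):

```python
net_loss = 126_000_000   # Q2 2014 net loss in USD, as reported
loss_per_share = 0.27    # reported loss per share in USD

# Back-of-the-envelope implied diluted share count
implied_shares = net_loss / loss_per_share
print(f"implied shares: ~{implied_shares / 1e6:.0f}M")  # ~467M

# Year-over-year revenue growth: $15.7B -> $19.34B
growth = 19.34e9 / 15.7e9 - 1
print(f"revenue growth: {growth:.0%}")  # 23%
```

The 23 percent figure matches the growth rate Amazon reported.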

Amazon launched a number of new products and services this quarter, which is the primary reason why losses were so steep.

As Amazon founder and CEO Jeff Bezos outlined in the earnings report, the company recently introduced Sunday delivery to a quarter of the US population. It also launched European cross-border two-day delivery for Prime, launched Prime Music, created three original kids' TV series, released the Fire TV set-top box and the Fire smartphone, and launched Kindle Unlimited.

He said they are continuing to work hard to make the Amazon customer experience better and better.

Despite all of that, however, investors weren't pleased with Amazon's results. Shares plummeted more than 10 percent on the news, losing more than $36 per share in value.

Looking ahead, Amazon expects third quarter revenue between $19.7 billion and $21.5 billion while analysts anticipate revenue around $20.81 billion.


Microsoft explains quantum computing in this easy-to-follow video


Samsung to host annual developers conference at Moscone West this November

A date and location have been announced for Samsung's next developers conference. The annual event will kick off on November 11 at Moscone West (the same venue Apple and Google used for their dev conferences earlier this year) in San Francisco, California, and will run through the 13th, according to a post on the company's official global blog.

Last year's inaugural conference saw the introduction of a number of Samsung SDKs including the Mobile SDK, a Smart TV SDK, a Multiscreen SDK and the KNOX Enterprise SDK beta.

While Sammy didn't give any solid information on what we can expect at this year's conference, the company did drop a hint or two about updates to its Mobile SDK, as well as its S Health SDK, Tizen SDK for Wearable and the Samsung Accessory SDK.

Those expecting to see some new hardware at the conference will likely be disappointed. As we've seen from others, these events are typically focused on the software side of things only. That's alright, however, as Samsung's next major hardware release, the Galaxy Note 4, is expected to arrive on September 3 at an Unpacked event just ahead of the annual IFA trade show in Berlin, Germany.

Interested parties are invited to submit their e-mail address to be notified when more information becomes available. No word yet on when registration will open or how many seats will be available.


Tuesday, July 29, 2014

AT&T's Q2 earnings report misses expectations despite rise in revenue

AT&T announced that its earnings declined 7.2 percent, while its revenue was up 1.6 percent from a year ago. The announcement came as part of the company's second quarter earnings report, which it released yesterday, a day after Verizon posted strong profits in the second quarter.

The telecommunications giant posted a profit of $3.55 billion, or 68 cents a share, compared to $3.82 billion, or 71 cents a share, in the year-ago period. On an adjusted basis, the company earned 62 cents a share, missing analysts’ expectations by a penny. The revenue figure, which stood at $32.6 billion, also missed Wall Street’s consensus estimate of $33.22 billion.

The company said it added more than 1 million postpaid subscribers -- those who sign a long-term contract and pay at the end of each month. This is the company's largest such gain in the last five years, bringing its total to nearly 117 million subscribers.

Out of these newly added postpaid subscribers, around 707,000 were smartphone customers, while 366,000 were tablet customers. Total wireless net additions were 634,000 amid the loss of 405,000 prepaid customers.

On the broadband and television side of the business, the company said it added 488,000 net new U-verse Internet customers and 190,000 net new U-verse TV customers in the last quarter.

The company also announced that its proposed acquisition of DirecTV is steadily progressing. AT&T's CFO John Stephens said in a conference call that the deal had received approval from regulators in Brazil and several U.S. states.


Get The Sims 2 for free on Origin this week

From now until July 31st, EA is giving away copies of The Sims 2 Ultimate Collection on Origin for free. To redeem your copy, open up Origin on your PC and log in, head to the Games tab, click on Redeem Product Code and enter the code "I-LOVE-THE-SIMS".

The Ultimate Collection bundles in every expansion pack and stuff pack alongside the original game, giving you access to 18 extra content packs for the game. Unlike Steam's free play weekends, after you've redeemed your copy of The Sims 2, it'll stay in your Origin library forever.

The Sims 2 launched in 2004 to critical acclaim, and continued to see content pack releases until late 2008. In 2009 EA released a sequel to the popular life simulator, The Sims 3, which as of today has a whopping 20 content packs available for it.

The next game in the series, The Sims 4, will launch on September 2, 2014. Despite the game containing new features, such as an improved Create-a-Sim tool and better representations of personalities, other features available in past games have been controversially removed, including swimming pools and the toddler life stage.


Pinterest has a better male-to-female workforce ratio, but lags in ethnic diversity

Pinterest has revealed that its 300-strong workforce is 60 percent male and 40 percent female. The revelation came as part of the diversity report that the company announced yesterday.

Except for Yahoo, which recently reported a roughly similar male-to-female ratio (62:37), the social sharing site's gender diversity is better than that of most other well-known Silicon Valley companies. The company is in the same boat as its peers, though, when it comes to the ethnic diversity of its workforce, which is predominantly white and Asian.

Whites make up 50 percent of Pinterest's workforce and Asians 42 percent, while African-Americans and Hispanics account for just 1 percent and 2 percent, respectively.

"Today we're taking our latest step by giving a more holistic look at our demographics across the company. We're not close to where we want to be, but we're working on it", said Pinterest software engineer and tech lead Tracy Chou, who called for greater transparency last year about the proportion of women tech employees in the industry.

Since then, more than 150 startups have shared their women in engineering numbers, and some of the largest and most prominent tech companies like Google, Facebook, Twitter, and more have also published their stats.

Chou also said that the company is looking to improve its workforce diversity by working with groups like Girls Who Code, CODE2040, Girls Teaching Girls to Code, Anita Borg Institute, Hackbright Academy, and Out for Undergrad.

"While we’ve made some progress in diversifying gender at the company, we haven’t done as well in representing different ethnicities, and we’re focused on getting better", she said.


Intel Pentium Anniversary Edition Review: Back to its Legendary Overclocking Roots

For more than a decade tech-savvy users on a budget would commonly buy a sub-$100 CPU and achieve performance comparable to $200-$300 chips by overclocking. The practice dates back to the early Pentium and Celeron days and was a practical way to extract more performance out of low-end systems until Intel locked its Celeron, Pentium and Core i3 ranges about four years ago.

In fact, even most Core i5 and i7 processors have locked clock multipliers, forcing users to spend big to overclock. The last time we saw overclockable budget CPUs from Intel was during its Core 2 days when you could pick up a Core 2 Duo E7200 for a whisker over $100 and easily push it to 3.8GHz, a 50% boost that let the chip crush the then $850 Core 2 Quad Q6600 and $266 Core 2 Duo E8600.

Although the clock multiplier of the non-Extreme Edition Core 2 processors was still locked, this architecture responded very well to front-side bus (FSB) overclocking. The E7200, for example, came clocked at 2.53GHz using a 266MHz FSB with a 9.5x clock multiplier, yet it would happily accept a 400MHz FSB, resulting in a frequency of 3.8GHz!
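The arithmetic behind that overclock is straightforward: on Core 2-era chips the core clock is just the FSB speed multiplied by the (locked) clock multiplier. A minimal sketch using the E7200 numbers above:

```python
def core_clock_mhz(fsb_mhz: float, multiplier: float) -> float:
    """Core 2-era core clock: front-side bus speed times the CPU multiplier."""
    return fsb_mhz * multiplier

stock = core_clock_mhz(266, 9.5)        # 2527.0 MHz, i.e. the rated ~2.53GHz
overclocked = core_clock_mhz(400, 9.5)  # 3800.0 MHz, i.e. 3.8GHz
print(f"boost: {overclocked / stock - 1:.0%}")  # ~50%
```

Since the multiplier stays fixed at 9.5x, every extra MHz on the FSB buys 9.5MHz of core clock, which is how a 266-to-400MHz bus jump yields the roughly 50% boost mentioned above.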

To improve CPU performance, Intel eventually eliminated the FSB in favor of what is now known as the base clock. Unlike the front-side bus, the base clock allows for only very minor alterations, and overclocking it by even 10MHz isn't an easy task.

Even Intel's most extreme overclocking-oriented processors, such as the Core i7-4790K, are tuned using just the clock multiplier. Moreover, it means the cheapest Intel CPU available to overclockers is the Core i5-4670K, which isn't exactly made for budget systems at $240.

However, to mark the 20th anniversary of its Pentium brand, Intel has released a special fully unlocked Haswell dual-core Pentium G3258 for $72 -- just what the overclocking community has been waiting for.

Today we not only plan to overclock the Pentium G3258, but to demonstrate its capabilities in two builds that the most diehard gamer could be proud of. The systems are based on Asrock's Z97 Anniversary motherboards: one is a standard ATX build and the other uses the micro ATX version. Below is the full list of components used for each build.

One build pairs Corsair Dominator Platinum 8GB 2133MHz memory with a Corsair CS Series Modular CS650M power supply; the other pairs Kingston Fury DDR3 8GB 1866MHz memory with a Silverstone Strider Essential ST60F-ESB 600W power supply.


Sony to settle PlayStation Network class action suit for $15 million in goods and services

Sony has agreed to settle a class action lawsuit brought about by the 2011 PlayStation Network data breach, an event that resulted in the theft of names, addresses and potentially even credit card information belonging to 77 million members.

According to a report from Polygon, Sony has signed a preliminary agreement valued at $15 million. Instead of cash, however, plaintiffs would receive goods and services from Sony such as free PlayStation 3 and PlayStation Portable games, free themes, free subscriptions to PlayStation Plus, free subscriptions to the Music Unlimited service and free SOE Station cash (Sony's virtual currency to buy in-game items).

Those who didn't participate in Sony's "Welcome Back" package following the security breach will get to choose two separate benefit options or two instances of one PSN benefit option. Individuals who did accept Sony's package can receive a game benefit, a theme benefit or a PlayStation Plus subscription benefit.

Lawsuit participants will receive benefits on a first-come, first-served basis. All of this, of course, depends on the judge signing off on the settlement.

Sony was hit with a couple of massive data breaches back in 2011 and has been working to clean up the mess ever since. The first breach in April forced Sony to shut down the PlayStation Network for several weeks.

The company hired a former Homeland Security official to help get things in order after the breach but, of course, the damage had already been done. At the time, Sony estimated it would cost $171 million to clean up the mess associated with the breach.


The NES30 controller is a throwback that looks and feels like a true classic

No matter which controllers have passed through our hands over the years, you've still got to appreciate the classics. That's why this Bluetooth gamepad is so appealing. It combines the new and the old in a manner that looks and feels like one of the original, authentic NES controllers you could have pulled straight from your attic from the glory days of gaming. Best of all, it actually works.

8Bitdo's NES30 controller may feature a few extra buttons, but it's as close to a clone of the iconic input device as you can get, while also including surprisingly broad device support. Specifically, the controller works with Android, iOS, OS X and Windows, and can even double as a Wii remote.

It'll set you back a good $40 if you want to channel the glory days of the NES while gaming on your computer or mobile devices. That price includes the Xtander accessory that detaches from the controller and doubles as a stand for your smartphone or tablet. As for the NES30 itself, it supports a wired USB connection besides Bluetooth, while its keys include a 4-way D-pad, four action buttons, a pair of shoulder buttons, and select and start.

The NES30 is available from eBay and has already garnered a few rave reviews from GBATemp readers who have given it a test drive. Let us know if you decide to pick one up; if nothing else, you'll have an awesome replica to display in your home that you can store apart from your classic consoles.


Monday, July 28, 2014

The new Oculus Rift kits have begun shipping out to developers

Oculus VR is now shipping the latest developer version of the Rift headset to early buyers, according to emails sent out last night. Developers and hobbyists that pre-ordered version 2 of the headset (DK2) have begun receiving notifications that it's coming soon. In a post on Reddit regarding the notifications, Oculus VR community manager Andres Hernandez said “We’ve started (shipping), and it’s the real deal."

The original dev kit has been out in the wild for a while now, but DK2 comes with a number of enhanced features and is the closest version yet to what consumers will eventually get. It boasts a much higher resolution, with 1,080 horizontal lines, as well as an external camera to better track player motion.

The company will be shipping 10,000 units out this month. Previously, Oculus said it had surpassed 45,000 preorders on DK2 bringing the total sales on Rift to over 100,000. The shipping of DK2 brings the company one step closer to an expected late 2014/early 2015 consumer release.

As we last reported, while Oculus says 10,000 units are shipping this month, some developers with July delivery dates may see their shipping times pushed back to August. And remember: if Oculus catches you selling your DK2 preorder on eBay for a higher price, it will cancel the order, as it has already done at least once.


Twitter diversity report looks no different than other Silicon Valley tech giants

Diversity reports are the new hot topic in Silicon Valley as Twitter is next in line for everyone to comb over. Unsurprisingly, the microblogging platform on Wednesday joined a growing list of tech companies dominated by white males, just as we've seen from Facebook, Google, LinkedIn and Yahoo in recent months.

The report indicates that 70 percent of Twitter's overall workforce is male and 90 percent of its tech-oriented employees are men. Non-tech roles - things like marketing, public relations, human resources, etc. - are split 50/50 by gender, while 79 percent of leadership roles are held by, you guessed it, men.

The ethnic breakdown, meanwhile, reveals Twitter employs mostly whites and Asians at 59 and 29 percent of the overall workforce, respectively. That same ratio is pretty much mirrored across both tech and non-tech jobs. Leadership roles are held primarily by whites at 72 percent followed by Asians at 24 percent. Other races make up just four percent of all leadership positions.

Janet Van Huysse, Twitter's vice president of diversity and inclusion, said they are keenly aware that Twitter is part of an industry that is marked by dramatic imbalances in diversity and they are no exception.

By becoming more transparent with employee data, open in dialogue throughout the company, and rigorous in recruiting, hiring and promotion practices, Van Huysse added, the company is making diversity an important business issue for itself.


Google offers three months of free streaming music to celebrate Chromecast anniversary

Today marks the one-year anniversary of Chromecast, Google's uber-affordable streaming dongle that allows you to push content from your phone, tablet or notebook to your television. It was unveiled during a media event alongside the second-generation Nexus 7 tablet and Android 4.3 Jelly Bean with a jaw-dropping $35 price tag.

To celebrate the occasion, Google has released some updated usage stats as well as a sweet offer for existing Chromecast owners.

Since its launch, Chromecast users have cast media more than 400 million times. Google has also added hundreds of apps and a mirroring feature for Android devices over the year. We still don't know exactly how many dongles Google has sold but given its low price, the fact that initial batches sold out quickly and its availability in more than 30,000 stores across 20 countries, it's probably a pretty big number.

What's more, Google is offering all Chromecast owners a 90-day free trial of its premium music streaming service, Google Play Music All Access. The service launched at last year's I/O conference and is one of several offerings consumers can choose from in the crowded streaming music space.

Chromecast has spawned a couple of copycats like the Roku Streaming Stick although none have been able to match Google's $35 price point.

Have you tried Chromecast yet? If so, what are your thoughts on it? Let us know in the comments below!


Google buys 3D graphics startup drawElements for north of $10M

In an effort to boost its mobile 3D graphics capabilities, Google has acquired 3D graphics firm drawElements. Financial terms of the deal were not disclosed, but Arcticstartup is reporting that the sale price was over $10 million.

Launched in 2008, the Helsinki, Finland-based company specializes in analyzing and assessing mobile 3D graphics. Its primary product, called dEQP, is a powerful toolkit used to benchmark the accuracy, feature conformance and stability of OpenGL ES and OpenCL GPU implementations.

It enables detailed quality comparisons between different vendors and GPU architectures, as well as providing high-quality tools for analyzing and debugging any issues uncovered by the tests.

"Over the next few months, we’ll be working with our colleagues on the Android team to incorporate some of our technology into the compatibility test suite", the startup said on its website.

For Google, whose Android operating system runs on hardware from many different smartphone and tablet manufacturers, each with its own set of GPU parameters, the move seems logical: the company will likely use the newly acquired technology for device standardization and to keep manufacturer fragmentation low across devices. The company hasn't yet officially announced the deal.

While the startup's management will reportedly move to the search giant’s headquarters in Mountain View, California, the remaining members of the team will stay in Finland.

DrawElements is one of the smaller acquisitions Google has made so far this year. The company has also bought thermostat maker Nest Labs for $3.2 billion and satellite company Skybox Imaging for $500 million, among others.


Google reportedly seals the deal on Twitch $1 billion acquisition

Google has agreed to purchase game livestreaming service Twitch for $1 billion, according to sources familiar with the matter. Both companies have declined to comment as of this writing.

VentureBeat, which is reporting the news, admits it doesn't have all of the specifics on the deal or when it'll be officially announced, but it claims Twitch investors that participated in past funding rounds are pleased with the returns they are getting.

News of the merger first broke back in May, when Variety claimed YouTube was looking to spend $1 billion to acquire Twitch. If announced under the YouTube banner, it would be the largest YouTube-related acquisition since Google bought the video site itself in 2006 for $1.65 billion.

Twitch has exploded in popularity over the past few years. In June 2011, the service boasted 3.2 million monthly active users. That figure has since jumped to more than 50 million monthly active users with more than 1.1 million people broadcasting each month. Over 13 billion minutes of video are watched each month through the service.

Twitch's popularity goes hand in hand with the rise in popularity of eSports in general. Valve inked a deal with ESPN just last week for coverage of The International Dota 2 Championships, an eSports tournament with a prize pool of more than $10 million. A five-man Chinese team by the name of NewBee took home the $5 million first place prize.


The Nostalgia Machine delivers the hit songs from any year you select

I'm a sucker for nostalgia. There's something special about stepping back in time (metaphorically speaking, of course) whenever I hear a certain song, look at an old photo, etc. It's a reaction that was triggered when I recently came across the aptly named Nostalgia Machine.

It's a simple concept really. Select a year from the drop-down menu and you'll be presented with a lengthy list of hit songs from that year pulled directly from YouTube.

Sit back and enjoy!

Found is a TechSpot feature where we share clever, funny or otherwise interesting stuff from around the web.


Sunday, July 27, 2014

Valve updates the Steam Controller with an analog stick

Valve continues to tweak the elusive Steam Controller, recently adding an analog stick to a design that already includes two touchpads, several trigger buttons and traditional faceplate buttons.

The original design of the Steam Controller had a touchscreen in the center, which game developers were set to harness through an API. A revision of the controller in early 2014 replaced the touchscreen with more buttons, including a d-pad, in a design that was closer to traditional console controllers.

It's possible that Valve wants to make the Steam Controller more accessible to first time users, explaining the addition of an analog stick alongside two touchpads. Early reports from people who have tried out the controller say the touchpads take several hours to get used to, and even longer to achieve the promised PC-like accuracy.

There's still no definite word on when the Steam Controller, or Steam Machines for that matter, will be available to the public. In May, Valve pushed back the project's launch to 2015, so there's still a chance the company will once again alter the design of its gamepad.


Facebook exceeds expectations again with Q2 2014 results

Facebook has reported second quarter numbers that exceed analyst expectations for the eighth time in a row, prompting a positive reaction in the marketplace. The company showed growth in revenue, income and user numbers: compared to Q2 of last year, revenue jumped 61% to $2.91 billion and net income grew 139% to $791 million.

Facebook is also seeing user growth across the board, with 1.32 billion monthly active users and 829 million daily active users, up 14% and 19% year-over-year, respectively. When it comes to mobile specifically, Facebook's Q2 2014 numbers are even more impressive. The company hit 1.07 billion monthly active mobile users, up nearly 6% from last quarter, and its daily mobile users jumped 7.4% from 609 million to 654 million.
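Those mobile growth figures check out against the raw user counts; percent change is just the new value over the old, minus one. A quick Python sketch using the daily-mobile numbers above:

```python
def pct_change(old: float, new: float) -> float:
    """Fractional change from old to new."""
    return (new - old) / old

# Daily mobile actives: 609 million last quarter -> 654 million this quarter
print(f"{pct_change(609e6, 654e6):.1%}")  # 7.4%
```

The result matches the 7.4% quarter-over-quarter jump Facebook reported.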

Facebook has also now set records around the world for its global rise in average revenue per user (ARPU), according to reports. The company seems to be pulling in more cash on a user by user basis, even as it continues to grow. 

As some have suggested, the growth numbers are fairly impressive for a company that already has so many users, but the expansion also raises costs. Facebook spent $1.52 billion in the second quarter of 2014, up about 22% from the year prior.

However, despite the heavy expenses, Facebook clearly had a strong quarter. "Our community has continued to grow, and we see a lot of opportunity ahead as we connect the rest of the world," Mark Zuckerberg said in a statement accompanying the Q2 results.


Google X is working on a project to map perfect human health

Google X, the search giant's research and development arm, has set off on a new endeavor called the Baseline Study project. The goal of the project is to one day be able to detect health risks such as heart disease earlier in a patient's life such that preventative measures can be taken before it's too late.

The project is being led by Dr. Andrew Conrad, a molecular biologist credited with creating a cheap way to scan donated blood for HIV. He joined Google in March of last year and has put together a team of roughly 70 to 80 experts in the fields of biochemistry, imaging, molecular biology, optics and physiology.

The Baseline Study got under way earlier this summer through an unnamed clinical testing firm where doctors began collecting bodily fluids like urine, blood, saliva and tears from 175 anonymous volunteers. From there, Google will use its massive computing power to try and find patterns called biomarkers.

The hope is that these biomarkers will help researchers be able to detect health issues before a person even shows signs of them.

For example, a specific biomarker could reveal whether or not someone is able to break down fatty foods efficiently. Those that lack the biomarker could be at risk for early heart attacks in the future. By noticing this trait early and modifying their behavior now, they may be able to avoid the risk altogether.


Users can now make web-based calls on Google Voice via Hangouts without a G+ account

There are a number of rumors out there pointing at Google merging its Voice service with Hangouts. The company has been unifying its services and social network for some time, and now it looks like Google Voice is getting another step closer. 

Google's Alex Wiesen took to Google+ recently to announce that users can now make web-based phone calls using Hangouts directly from the Google Voice site. With no connection to Google+ required, users can select the option from the "phone to call with" drop-down menu. Gmail users have had very similar functionality right in the inbox for a while, but the service was never available directly on Google Voice.

As some have mentioned, the move to have the service being available even to those without a Google+ profile is an interesting one considering how aggressively the company has been integrating its social network in with its other services. 

While the update is a minor one, some users will likely find it a handy addition. The move doesn't directly point to the services one day merging into one, but even without Google+ being a requirement for Hangouts calls in Google Voice, it still feels like a fully unified service is where the company is headed.



Phone unlocking to become legal in the United States again

Back in 2012, the United States Copyright Office effectively outlawed the unlocking of cellphones, making it very difficult for someone to use their phone on a carrier other than the one it was purchased for.

A new bill that's set to be signed into law by President Barack Obama will change this, making cellphone unlocking legal in the country. Known as the "Unlocking Consumer Choice and Wireless Competition Act", the bill has already passed both the House and the Senate.

The passing of the bill follows many months of activism from groups who believe consumers should have the right to unlock devices they've legally purchased. While having an unlocked handset is necessary for switching providers without purchasing a new device, it's especially helpful for travelers, who might want to use a local SIM over often-expensive global roaming options.

Obama said in a statement that "the bill Congress passed today is another step toward giving ordinary Americans more flexibility and choice, so that they can find a cell phone carrier that meets their needs and their budget". 



Verizon is preparing to throttle 4G LTE unlimited data users

Starting October 1, Verizon customers on unlimited data plans that meet certain criteria will see their 4G LTE wireless connection throttled as part of what the carrier calls "Network Optimization," a practice that previously only affected 3G connections.

The good news is that the list of criteria that must be met for speed reduction to occur is pretty lengthy. Here's the full rundown according to Verizon:

- Top 5% of data users (you use 4.7GB of data per month or more)
- Enrolled on an unlimited data plan or feature
- Have fulfilled their minimum contract term
- Are attempting to use data on a cell site that is experiencing high demand

Keep in mind that a customer must meet all of the above criteria before reduced speeds kick in. Also, once a user has been throttled, they may continue to be impacted for the rest of the current billing cycle and through the next billing cycle.

The key here seems to be whether or not you're on a high-demand cell site. Verizon doesn't consider this to be true throttling as speeds aren't reduced for the entire billing cycle, 100 percent of the time. For example, if you are a heavy data user and there's plenty of available bandwidth, your speeds won't be impacted.
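Since all four criteria must hold at the same time, the eligibility logic amounts to a single boolean AND. Here is a minimal sketch; the function and parameter names are illustrative, not Verizon's actual system:

```python
# Sketch of Verizon's "Network Optimization" eligibility: speeds are
# reduced only when ALL four criteria are met at once.
TOP_5_PERCENT_GB = 4.7  # monthly usage threshold cited by Verizon

def may_be_throttled(monthly_gb, on_unlimited_plan, contract_fulfilled, cell_site_congested):
    heavy_user = monthly_gb >= TOP_5_PERCENT_GB
    return heavy_user and on_unlimited_plan and contract_fulfilled and cell_site_congested

# A heavy unlimited-plan user on an uncongested cell site is left alone:
print(may_be_throttled(10.0, True, True, False))  # False
```

Dropping any one condition flips the result, which is why the high-demand-site test ends up being the deciding factor for most heavy users.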

Unlimited data plans were once an effective marketing method, but as we slowly transitioned to a data-heavy society, carriers realized they could make more money by eliminating unlimited data plans and moving customers to tiered plans where they can charge a lot for overages.

According to DroidLife, more than 20 percent of Verizon data customers are still on unlimited plans. Verizon, however, claims this figure isn't accurate and the actual percentage is lower.



Saturday, July 26, 2014

Facebook wants to integrate car-hailing service Uber into Messenger

Facebook's Messenger app is not even close to where Mark Zuckerberg would like it. The vision is to transform Messenger from a pure communications tool into an e-commerce platform not unlike what Asian competitors Line and WeChat have pulled off.

A potential first step in that direction could be an integration with car-hailing service Uber.

Zuckerberg has held private talks with Uber chief Travis Kalanick about doing just that, according to sources familiar with the matter cited by Re/code. At least one source noted the whole idea is very conceptual at this stage and nowhere near execution.

Both Facebook and Uber declined to comment on the subject.

Integrating Uber with Facebook could make a lot of sense for both companies. For Uber, it would expose the service to far more potential customers. Back in April, Zuckerberg revealed that Messenger had 200 million monthly users.

Facebook, meanwhile, would be one step closer to making Messenger a well-rounded app with more than just communication to offer. They'd also get more credit card numbers on file which could be useful for future endeavors.

The only potential hiccup would be what to do with Uber's own mobile app. Those who already use it would likely continue to do so, while newcomers may be unsure whether to use the standalone app or the one integrated into Messenger.



China Telecom to carry Xbox One starting this September

Microsoft's Xbox One is heading to China. The next-generation console will soon be sold through China Telecom, one of the nation's leading telecommunications companies, according to a report from Reuters.

The Xbox One will go on sale in September and will be the first gaming console sold in mainland China since the country's government issued a ban on such devices in 2000. The ban, put in place over concerns about how violence and mature content may affect Chinese youth, was recently lifted.

A China Telecom spokesperson told the publication that pricing for the console hasn't been decided yet. Further details regarding the launch will be revealed next week according to Microsoft China spokesperson Joanna Li.

The announcement came yesterday during a China Telecom media event focused on the Internet of Things. A set-top box and a smart television were shown as part of a smart home series called Yue Me. The Xbox One, with its heavy entertainment slant, should fit right into what China Telecom is trying to do.

It's a significant move for Microsoft as its Xbox One hasn't sold as well as Sony's PlayStation 4. Much of that had to do with Microsoft pricing their system $100 higher. Last month, however, Microsoft finally unbundled the Kinect and priced the barebones Xbox One at $399 to match the PS4.

Sales have doubled since the move, and with more than 104 million China Telecom broadband subscribers in sight, the Xbox One may finally start catching up to the PS4.



Thursday, July 24, 2014

Science Graphic of the Week: Climate Change on Tatooine

This fictional map shows how the average temperature across Tatooine’s surface has changed over the last 110 Galactic Standard Years, from 100 years before the Death Star was destroyed to 10 years after. David Ng

Just because Luke Skywalker’s home planet of Tatooine is fictional doesn’t mean it’s immune to the effects of climate change. This map shows how, in the past 110 Galactic Standard Years, Tatooine has turned from a sprawling, desert wasteland into an even hotter sprawling, desert wasteland. It comes from Tatooine’s first Intergovernmental Report on Climate Change, written by 23 droids (not really) and a human named David Ng, a molecular biologist from the University of British Columbia in Vancouver.

Ng is in cahoots with several other science writers who are using Tatooine as a device to teach real world science. Ng writes on his blog that the report is an “overly elaborate teaching prop” that he hopes will help people understand how scientists come to a consensus on climate change.

He based his report on the findings of planet Earth’s own Intergovernmental Panel on Climate Change. The IPCC’s fifth assessment report, which is being incrementally released between September 2013 and November 2014, builds on what we know about climate change: that it’s happening, and it’s being caused by our dependence on fossil fuels. The IPCC report is one of the most important scientific documents out there, Ng said in an email to WIRED, but is unfortunately incredibly boring to read. “Why not throw in a Star Wars angle to capture a few more eyeballs?” he said.

Unlike fossil fuels on Earth, water vapor from Tatooine’s unregulated water-mining industry is most likely to blame for the planet’s temperature rise. Like carbon dioxide, water vapor is a greenhouse gas that stores and emits thermal energy. (Incidentally, Luke’s aunt and uncle were moisture farmers. At least, until they were shot by Imperial stormtroopers.)

Like the real world IPCC report, one of the central concepts in the Tatooine report is radiative forcing, which measures the balance of thermal energy in the atmosphere. Based on the volume of greenhouse gas emissions, radiative forcing lets scientists make predictions about future climate scenarios.

These different scenarios have ecological implications, as entire ecosystems need to migrate or adapt to a shifting environment. As the graph below shows, certain key species are more resilient than others. Most of these scenarios don’t favor the sarlacc, which would not be able to migrate along with its prey because it’s a flesh-eating pit in the sand.

This chart shows how likely different species are to survive under different warming scenarios. Ng based this on each creature's ability to migrate. Each of the scenarios begins with a Representative Concentration Pathway. These pathways use the amount of atmospheric energy (measured in Watts per square meter) to calculate warming. The highest, RCP12.0, is the current emissions trajectory, and assumes that Tatooine will have 12 Watts of atmospheric energy per square meter by the 100th anniversary of the Death Star’s destruction. David Ng
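To get a feel for what a Representative Concentration Pathway number implies, here is a minimal warming sketch using the linear relation ΔT ≈ λ × forcing. The sensitivity value λ ≈ 0.8 °C per W/m² is a rough, Earth-like illustration, not a figure from Ng’s report (and real IPCC pathways top out at RCP8.5; RCP12.0 is Tatooine’s fiction):

```python
# Illustrative equilibrium-warming estimate: delta_T = lambda * forcing.
CLIMATE_SENSITIVITY = 0.8  # degrees C per W/m^2 (assumed, rough Earth-like value)

def equilibrium_warming(forcing_w_per_m2):
    """Warming implied by a given radiative forcing, in degrees C."""
    return CLIMATE_SENSITIVITY * forcing_w_per_m2

for rcp in (2.6, 4.5, 8.5, 12.0):  # real RCPs, plus the fictional RCP12.0
    print(f"RCP{rcp}: ~{equilibrium_warming(rcp):.1f} C of equilibrium warming")
```

Under this toy relation, Tatooine’s 12 W/m² pathway implies several degrees more warming than Earth’s worst-case RCP8.5.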




Have a Drone? Check This Map Before You Fly It

Mapbox’s interactive map of drone no-fly zones. Bobby Sudekum / Amy Lee / Mapbox

The popularity of drones is climbing quickly among companies, governments and citizens alike. But the rules surrounding where, when and why you can fly an unmanned aerial vehicle aren’t very clear. The FAA has tried to assert control and insist on licensing for all drone operators, while drone pilots and some legal experts claim drones do not fall under the FAA’s purview. The uncertainty—and recent attempts by the FAA to fine a drone pilot and ground a search and rescue organization—has UAV operators nervous.

To help with the question of where it is legal to fly a drone, Mapbox has put together an interactive map of all the no-fly zones for UAVs they could find. Most of the red zones on the map are near airports, military sites and national parks. But as WIRED’s former Editor-in-Chief, Chris Anderson, now CEO of 3-D Robotics and founder of DIY Drones, discovered in 2007 when he crashed a drone bearing a camera into a tree on the grounds of Lawrence Berkeley National Laboratory, there is plenty of trouble in all sorts of places for drone operators to get into.

As one of the map’s authors, Bobby Sudekum, writes on the Mapbox blog, it’s a work in progress. They’ve made the data they collected available for anyone to use, and if you know of other no-fly zones that aren’t on the map, you can add that data to a public repository they started on GitHub.
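In the same spirit as the map, a launch-site check against a list of zone centers can be sketched as a great-circle distance test. Everything here — the coordinates, the zone list, and the 5-mile radius — is an illustrative assumption, not FAA policy or Mapbox’s actual data:

```python
import math

EARTH_RADIUS_MI = 3959.0

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_MI * math.asin(math.sqrt(a))

def near_no_fly(lat, lon, zone_centers, radius_mi=5.0):
    """Flag a launch point within radius_mi of any zone center."""
    return any(haversine_miles(lat, lon, zlat, zlon) <= radius_mi
               for zlat, zlon in zone_centers)

# Illustrative zone centers (roughly SFO and Oakland airports):
zones = [(37.62, -122.38), (37.72, -122.22)]
print(near_no_fly(37.63, -122.40, zones))  # True
```

A real check would use the polygon geometry from the GitHub repository rather than circular buffers, but the distance test captures the idea.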

For instance, you’ll see on the map below that there isn’t a no-fly area over Berkeley Lab, which sits in the greyed area in the hills above UC Berkeley. Similarly, there is no zone marked around Lawrence Livermore National Laboratory, one of the country’s two nuclear weapons labs. I have a call into the lab to check on the rules*, but in the meantime, if you have a drone, just know that in 2006, the lab acquired a Gatling gun that has a range of 1 mile and can fire 4,000 rounds a minute.

*UPDATE 5:30 p.m.: Livermore Lab confirms that drones are not allowed. I’m following up with the Department of Energy to see if I can get more specifics and check on other facilities.

Check out the interactive version at Mapbox.

The Bay Area on Mapbox’s drone no-fly map. Bobby Sudekum / Amy Lee / Mapbox



Wednesday, July 23, 2014

A New Nike Baseball Glove That Comes Broken In


Back in the day, breaking in your baseball glove was a summer rite of passage. There were all sorts of tricks for softening the stiff hunk of leather—you’d stick it in the oven, run it over with a car, stash it under your mattress while you sleep.

Nike is looking to make that process obsolete. The company just came out with the Vapor 360 glove, and Nike says it’s ready to use from the moment you put it on. “It was always one of our top priorities that this glove would be game ready out of the box,” says Matthew Hudson, a senior designer at Nike. “We all agreed that the break in process of a regular glove was a problem we needed to solve.”

Full leather gloves, while durable and effective once broken in, are still pretty heavy. Nike wanted to reduce the effort a player exerts by lightening the entire glove and replacing the bulk of leather with more synthetic, reactive materials. To do this, the designers looked to Nike’s footwear division for inspiration.

Render of the Vapor 360 glove. Nike

The glove uses the same heat-welding technology that you find in Nike’s running and basketball shoes—which Nike dubs “Hyperfuse.” That process allows materials to be joined up, Frankenstein-like, so that each bit of fabric can be selected for whatever’s most important at its own particular location. Thus, the panel of the glove is nearly seamless and made of lightweight materials. The palm is still made of leather. Look closely and you’ll see little perforations, which accelerate the break-in process to the point that the glove is ready to use right away.

Nike’s innovation kitchen, the branch of the company working on its more forward-thinking technologies, has been investigating how to make a baseball glove with no leather. They’ve been trying to find a synthetic material that mimics the molding capabilities of leather but in a lighter form, but they realized it’s still hard to beat leather for the palm portion of the glove. “Until we find a synthetic that can mold like a natural skin, leather is superior,” says Hudson.

Nike’s previous baseball glove weighs in at around 680 grams; this one is around 585 grams. “You can feel that difference straight away,” says Jeremy Hewitt, a product line manager at Nike. Lighter might mean faster performance, but it makes you wonder, how durable is this glove anyway? “It’s not an heirloom, it’s a performance machine that you pull out of the box and it’s ready to go right off the lot,” says Hewitt. “We like to say it’s not the Harley Davidson, it’s the Ducati.”




Shape-Shifting Wings, From Soviet War Planes to Top Gun’s Tomcat

As a jet approaches supersonic speeds, air compressing at the front of the wing causes drag. Engineers figured out that sweeping the wing back can disrupt this drag effect. However, at low speeds this sweep causes air to travel along the wing (from root to tip) instead of over it. This makes jets stall, and stalling makes jets crash. Jets with a mechanically adjustable wing angle, like the F-14, solved this problem. CDR David Baranek/US Navy

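The caption’s drag logic is usually captured with simple-sweep theory: the wing mostly “feels” the component of the flow normal to its leading edge, so sweeping back by an angle Λ reduces the effective Mach number to M·cos(Λ). This is a textbook approximation, not something taken from this article:

```python
import math

def effective_mach(freestream_mach, sweep_deg):
    """Simple-sweep theory: only the flow component normal to the
    leading edge drives compressibility drag."""
    return freestream_mach * math.cos(math.radians(sweep_deg))

# With its wings swept fully back (about 68 degrees), an F-14 at Mach 1.2
# keeps the effective Mach number over the wing well below 1.0:
print(effective_mach(1.2, 68))
```

The trade-off is the one the caption describes: the same sweep that tames transonic drag encourages spanwise flow and stalls at low speed, which is what variable geometry was invented to resolve.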

Take off. Rise. Soar. Bank. Turn. Stall. Swoop. Dive. Land.

For each of the different kinds of flying an airplane has to do, there’s an ideal shape and configuration for its wings. Even though bird-like flappability isn’t feasible with struts and steel, engineers since the dawn of aviation have been trying to make wings that change shape.

Sometimes the morphing seems inevitable, like the wing flaps that most planes use to steer. Other cases, like the airfoils that sweep back to a vee and carry our fighter jets to supersonic speeds, could only have come from trial-and-error arms races. And then there are the oddballs, the telescoping, twisting, torquing shapeshifters.

Any engineer can tell you that solving one challenge often means introducing others, sometimes catastrophic. For every morphing wing design that makes it into the aeronautics canon, there are dozens of others that survive only in footnotes, photographs, or the graveyard.



They’re Here: Massive Mayfly Emergence in Wisconsin

How badly do you want your money? Mayflies cover an ATM in La Crosse, Wisconsin, July 20th, 2014. National Weather Service/NOAA

Earlier this year I wrote about the annual emergence of Lake Flies in Wisconsin. But wait, there’s more! More insects, that is. This week Wisconsinites were treated to a mayfly emergence. Just how many mayflies are there? Enough that they show up on weather radar:

“The Mississippi River produced a massive radar echo as mayflies emerged from the water and became airborne. The mayflies were detectable on radar around 8:45 pm…The radar loop below shows the reflected radar energy (reflectivity) from 8:35 pm to just after midnight. The higher the values (greens to yellows) indicate greater concentrations of flies.”

July 20th, 2014: Night of the Mayflies. National Weather Service/NOAA

Because they coat surfaces in an epic biblical plague, mayflies sometimes cause highway accidents. The slippery goo created by millions of mayflies was blamed for a three-car pileup in Hager City, WI, last night. A major emergence in 2012 resulted in snow plows being called out to clean up the mess.

Adult mayflies are basically gonads with wings. They don’t eat; some species don’t even have working mouthparts. They are not interested in your puny roads and ATM money machines, humans. They need to get busy gettin’ it on in a massive mayfly orgy. (Also, male mayflies have two penises. I thought you’d want to know.)

The mayfly species emerging this week lives only one night as an adult. They mate, the male dies after ejaculation, and the female feebly flutters off to lay her eggs. The name for this group of insects is Ephemeroptera, from the Greek word for “short-lived” or “ephemeral”.

It’s not actually a “live fast, die young in a blaze of sexual glory” story. Mayflies spend the majority of their life underwater, quietly eating algae and plant material. The full growth cycle of a mayfly can take up to 4 years; we just notice them when they pile up in post-coital exhaustion.

Mayflies emerge synchronously around dusk to avoid their main above-water predators: birds and bats. Predators trying to capitalize on a sudden mayfly all-you-can-eat buffet are overwhelmed by the emergence of millions of insects. Some individuals make it through, and the species continues.

Mayfly larvae are delightfully called “naiads,” and provide critical food for fish. The bodies of immature mayflies have beautiful external gills; this is also why they are important in assessing water quality. Mucky, polluted water is not a place a mayfly larva can breathe. Detroit has had mass mayfly emergences in the past, but a toxic algal bloom in Lake Erie this year is damaging populations of all the animals in the watershed.

This video shows the last minute of a female mayfly’s life; she flops on the water in exhaustion and dumps a massive load of eggs into the water. Within seconds, her eggs hatch and the naiads begin to swim.

Mayflies are some of the most ancient insects around; they are well represented in Carboniferous fossils dating >300 million years ago. Fossil mayflies look remarkably like our modern mayflies; some consider them “living fossils.” The oldest fossil of a winged insect is a mayfly.

This dance of death and birth has been going on for a long time; try to focus on the wonder, rather than the gross out. The protein in mayfly bodies may have powered the rise of the reptiles. We can share some space for a few days.

Homepage image: Richard Bartz via Wikimedia Commons



Bats Use Polarized Light to Set Their Internal Compasses

Greater mouse-eared bat, Myotis myotis, from Bulgaria. Stefan Greif

Although bats are known for using echolocation to orient and navigate, they draw on a suite of senses to get around. A new study reveals another ability: Bats use patterns of polarized light in the sky to navigate.

Richard Holland and Stefan Greif, of Queen’s University Belfast, with Ivailo Borissov and Yossi Yovel of the University of Tel Aviv, showed that female greater mouse-eared bats use the way the sun’s light is scattered in the atmosphere at sunset to calibrate their internal magnetic compasses. The study, which is the first to show this ability in a mammal, is published in Nature Communications.

When sunlight, which radiates in all directions, is scattered by the Earth’s atmosphere, it becomes directional. Polarization patterns depend on where the sun is in the sky. They’re most evident at sunrise and sunset, in a strip across the sky 90° from the position of the sun.
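For single Rayleigh scattering, the degree of linear polarization follows a simple textbook formula that peaks exactly 90° from the sun, matching the strip the paragraph above describes. A sketch (an idealization; the real sky pattern is messier due to multiple scattering):

```python
import math

def polarization_degree(angle_from_sun_deg):
    """Degree of linear polarization for single Rayleigh scattering:
    sin^2(theta) / (1 + cos^2(theta)), maximal at 90 degrees from the sun."""
    theta = math.radians(angle_from_sun_deg)
    return math.sin(theta) ** 2 / (1 + math.cos(theta) ** 2)

for angle in (0, 45, 90, 135, 180):
    print(f"{angle:3d} deg from sun: polarization {polarization_degree(angle):.2f}")
```

At sunrise and sunset the sun sits on the horizon, so that maximally polarized 90° band sweeps overhead — which is exactly when the bats in this study seem to read it.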

A range of animals use the pattern of polarization as an orientation cue, such as many invertebrates, plus some birds, fish, reptiles, and amphibians.

Greater mouse-eared bat, Myotis myotis, from Bulgaria. Stefan Greif

“We had already demonstrated that bats used a magnetic compass that was calibrated by cues observed at sunset,” says Holland. “The question was, what cues? It was known that birds calibrate the magnetic field with the pattern of polarization at sunset, so we tried the same for bats.”

Holland and his colleagues exposed 70 greater mouse-eared bats to one of two different types of polarization patterns at sunset. They placed the bats in experimental boxes equipped with filters to manipulate the pattern of polarization. Then the researchers took the bats to one of two release sites about 12 to 15 miles from their home roost. They released the bats at 1:00 AM, when no polarization pattern was visible, and tracked where the bats flew using small radio transmitters on the animals’ backs.

Bats that had observed a polarization pattern shifted by 90° headed off at right angles to the bats that had observed the natural pattern of polarized light. “By shifting the pattern 90°, we shifted the bats’ orientation after sunset (when only the magnetic compass was available) either 90° east or west of the control bats,” Holland says. “It shows that the bats calibrate the magnetic field with the pattern of polarization at sunset.”

Bats watch the sunset from their experimental box. Stefan Greif

Though the behavioral evidence clearly shows bats use polarized light, it’s not yet known how they detect it. Insects such as bees have specially adapted photoreceptors in their eyes. The picture isn’t as clear in vertebrates that perceive polarized light, but the ability might be related to the structure of their cone cells. As far as bats go, the question of how they detect polarized light is wide open.

Bats likely use many senses, including sight, sound, and the Earth’s magnetic field, to navigate.

“We know that bats can use echolocation and vision for navigation when they are in a familiar place or can see familiar cues,” says Holland. “But outside this range the ‘map and compass’ mechanism comes into play, where the animal determines its position and then takes up the compass direction it needs to head in to reach its goal.”

Reference:

Greif, S., Borissov, I., Yovel, Y., and Holland, R. A. (2014). A functional role of the sky’s polarization pattern for orientation in the greater mouse-eared bat. Nature Communications, published 22nd July 2014. doi: 10.1038/ncomms5488.



Tuesday, July 22, 2014

Science and Sensationalism: Is Japan’s Fuji in a “Critical State” for an Eruption?

Sometimes, it’s the sales pitch that gets you rather than the actual car. That seems to be the case with the latest rash of media coverage over the “critical state” at Japan’s Fuji. You read the news coverage and you’d think that Fuji will erupt any second now, all thanks to the 2011 Tohoku earthquake that struck off the coast of Japan. Now, I wouldn’t blame you if you got that message — it is exactly what press releases and quotes from the authors make it seem is the case. Dr. Florent Brenguier, the lead author on a new study that appeared in Science, was quoted as saying, “All we can say is that Mount Fuji is now in a state of pressure, which means it displays a high potential for eruption. The risk is clearly higher.” Seems pretty straightforward, doesn’t it? Their research must clearly show that Fuji is now in a state ready to erupt and we know that from some sort of pressure measurement.

Now, it is hard for me to blame the media for not going back and carefully reading the Science article to see if their data supports such grandiose claims. You have to believe that if a paper is published in Science, then it is supported by verifiable data — and for the most part, it is. Like any reputable journal, Science is rigorously peer-reviewed before any article is published. Now, big name journals like Nature and Science do fall prey to a significant trap. Not only do they want what they see as quality scientific research, but they also want it to be flashy. So, you might have done the best study ever on the eruptive history of Mt. X, but Nature and Science wouldn’t touch it unless you can make it flashy: Is Mt. X a “supervolcano”? Did it change global climate? Will it destroy us all in the future? In a sense, Nature and Science are the Hollywood of science publications — they want the big tentpole papers. That is where the danger lies, especially if you’re trying to get media coverage. How far can you push the interpretations, possibly even from outside the paper itself, to get the attention you desire?

Back to the Brenguier and others (2014) study on Mt. Fuji. They examined how the state of pressure in the crust across Japan changed after the massive M9 Tohoku earthquake in 2011. That earthquake released a massive amount of energy, and although it relieved stress near its epicenter, it likely caused stress in the crust to increase in other places as that energy was displaced. By examining how quickly seismic waves move through the crust (which is partially controlled by the state of stress in the crust), they could see where new stress has accumulated. They argue that places with the largest velocity reduction after the Tohoku earthquake are the places where the crust is feeling low effective pressure. This low effective pressure is caused by pressurized fluids, like magma or hydrothermal fluids (i.e., water), in the crust pushing outward on the rocks.

Figure 2 from Brenguier and others (2014), showing the change in seismic velocity across Japan after the M9 2011 Tohoku earthquake. Brenguier and others (2014), Science.

Not surprisingly, the places that saw the largest velocity reduction were places underneath all the active volcanoes across Japan (see right). In contrast, the smallest reduction occurred in places with rigid rocks, like granite. This change in seismic velocity is tiny — even in the areas with the largest change, it was only ~0.12%. Now, this is where it gets tricky. They state: “The seismic velocity susceptibility to stress can be used as a proxy to the level of pressurization of the hydrothermal and/or magmatic fluids in volcanic areas.” This means that anywhere that either hydrothermal or magmatic fluids are present can experience the large drop in seismic velocities. So, you can measure changes in seismic velocity to understand changes in pressurization of the crust — such as when new magma is intruding or hydrothermal fluids are moving through the crust.
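The measurement behind a number like that ~0.12% can be sketched numerically: a small relative drop in velocity stretches seismic travel times proportionally, so dv/v ≈ −dt/t. The travel times below are made up for illustration, not the study’s data:

```python
def relative_velocity_change(t_before, t_after):
    """dv/v from a travel-time pair: a velocity drop lengthens travel
    time, so dv/v = -(t_after - t_before) / t_before."""
    return -(t_after - t_before) / t_before

# A 0.12% slowdown stretches a 10-second travel time by about 12 ms:
print(f"dv/v = {relative_velocity_change(10.0, 10.012):.4%}")
```

Detecting a shift that small is why studies like this lean on long averages of repeating ambient-noise signals rather than individual earthquakes.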

In my mind, that is their key conclusion. It does not mean that the Tohoku earthquake caused the pressurization of the area as such. Rather, that changes in seismic velocity after the earthquake can tell us something about the state of pressurization in the crust. They do go on to say that an earthquake occurred 4 days after the Tohoku temblor, and it happened to be near Fuji (which hasn’t erupted since 1707, making people worry it’s “overdue” — it isn’t), but this correlation is not a piece of their evidence for their conclusion, but rather their way of trying to say the crust was prone to new earthquakes already and Tohoku triggered it. This is a bit of a stretch without further research to support this triggering.

The one thing they never say in the paper is that Fuji is more likely to erupt thanks to the Tohoku earthquake. Never. Not once.

So, why is this the message that we’re being fed in the news? Well, it’s thanks to the authors deciding that a conclusion outside their paper is the one that is most media-ready. Would the media be all over a study that made the bold claim that changes in seismic velocities can tell us a little something about the state of pressure in the crust? I would venture to say no. Now, if you instead say the change after the earthquake puts a big volcano in Japan (a national icon) into a “critical state” that could mean an eruption will occur soon? Stop the presses! Yet, this isn’t the conclusion of the actual Science article at all. I have no way of knowing, but this external “conclusion” about Fuji could have been included originally and removed during peer review. I mean, we’ve seen this idea before, that a certain increase in pressure means Fuji will erupt, but it has never really been shown to be verifiable. We’re actually stuck in a chicken-and-egg loop here: did the earthquake tell us that pressure is high enough for an eruption (one that was going to happen anyway), or did the earthquake add more pressure and make an eruption more likely? Fuji is a dangerous (yet wonderful) volcano, as is any volcano near large population centers, so understanding its behavior and planning for an eruption is important.

This is not to say that the science in the Brenguier and others (2014) article isn’t good science. From what I can tell, it is. However, there is a fine line in my mind between promoting your work and going all P.T. Barnum on everyone. Maybe the quotes were taken out of context (although it seems unlikely). Without understanding what actually triggers an eruption at Fuji (or any volcano for that matter) and without knowing whether the pressure in the crust in these volcanoes is due to magma or hydrothermal fluids, it is definitely a stretch to say that “the risk is clearly higher.” However, it does make much splashier press to lead with “Fuji could erupt” over “seismic waves changed velocity.”



Hacker Musician Turns E-Waste Into an Awesome Instrument

The hacker with his hybrid synth. The CD drives emit sound when the CDs are spun. Electronic Waste Orchestra

We tend to think of musical instruments in fixed terms: that’s a guitar, this is a saxophone, that’s a synthesizer. Colten Jackson, however, plays an instrument that’s hard to classify. The Illinois musician hacked together what he calls the Hard Rock Guitar out of e-waste: six obsolete hard drives and an old keyboard number pad, powered by an Arduino board. At Jackson’s command, it emits a range of synthy, ambient tones. If he wants to change the notes or scales, he need only tinker with the software. “Instruments are this free-form art; they just have to make sound,” he says. “Whatever you start with, whether it’s garbage or e-waste, it lends itself to something.”


Jackson is an active member of the open-source Makerspace lab in Urbana, Illinois, and the founder of an affiliated project called the Electronic Waste Orchestra. The Orchestra (not technically an orchestra; it consists of a bimonthly meetup and a four-week summer camp for kids) began after Jackson read a Hackaday article about turning old hard drives into synthesizers. When the disc platter in a hard drive spins, it produces sound waves, which an Arduino can measure and then transmit to sound-generating software. The Makerspace lab had plenty of these unwanted drives lying around, so Jackson powered them up and pieced together the Hard Rock Guitar. A number pad is affixed to the top, where each key is assigned a chord or a note, depending on how the player programs the software.
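The key-to-note assignment described above can be sketched in a few lines. To be clear, Jackson's actual software isn't shown in the article, so the key layout, scale, and function names below are all invented for illustration; the only fixed piece is the standard equal-temperament conversion from MIDI note numbers to frequencies.

```python
# Hypothetical sketch of a number-pad-to-note mapping like the one described.
# None of this is Jackson's real code; the layout and scale are invented.

def midi_to_hz(note):
    # Standard equal-temperament conversion: A4 (MIDI note 69) = 440 Hz
    return 440.0 * 2 ** ((note - 69) / 12)

# Assign each keypad key a MIDI note: here, one octave of C major from middle C
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]
KEYMAP = {str(i + 1): note for i, note in enumerate(C_MAJOR)}

def key_pressed(key, keymap=KEYMAP):
    # Look up the key's note and return the frequency to synthesize.
    # "Reprogramming" the instrument is just swapping out the keymap dict.
    return midi_to_hz(keymap[key])

print(key_pressed("1"))  # middle C, about 261.6 Hz
print(key_pressed("6"))  # A4, 440.0 Hz
```

Swapping `C_MAJOR` for another list of MIDI notes is the software-side equivalent of what the article describes: same hardware, different scale, new instrument.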

The Hard Rock Guitar is Jackson’s inaugural instrument (and “the only one you could stand on stage with,” he says), but he’s tinkered around with a handful of other creations, like circuit boards strung together and an old push-button telephone plugged into batteries. The beauty of e-waste instruments, he says, is that virtually anything meant to be electrically powered can yield music. “You look at the thing and think about how it works: if it has an amplifier, and an interesting touch pad that sends out data somehow,” he says. One notable exception? Newer circuit boards. “The parts are so small,” Jackson says. “These days there’s six layers on top of each other sealed with resin, so it’s hard to get in there.”

E-waste is a chronic environmental problem with fragmented solutions. A reported 50 million tons of obsolete electronics were thrown away last year, and ideas for shrinking that number run from straightforward recycling programs to turning old gadgets into gold. Jackson isn’t pretending to solve that with the Electronic Waste Orchestra. Yes, it offers up one way for people to recycle their own unused electronics, but at its core the Orchestra is about a new approach to designing music.

Because the actual music is created within a software program, it would be easy to lump Jackson’s work in with the rest of electronic music. Instead, the Electronic Waste Orchestra is more philosophically akin to the improvisational style of jazz. Part of the randomness comes from learning to work with unwanted materials: “It introduces constraints to design,” Jackson says. “If you were to try to design a digital instrument from scratch, I think it’d be hard to come up with an idea on a blank canvas, but dig through a pile of e-waste and find a circuit board, and now you have an interface. When you have this new form to play with, it’ll be a new kind of music.”
