Archives for April 2018

'Westworld' Recap, Season 2 Episode 2: The Façade Is Crumbling

Fellow watchers of Westworld, we have cracked the façade.

The second episode of Season 2 opens on Dolores’ (Evan Rachel Wood) face. Bernard (Jeffrey Wright) asks if she knows where she is; she guesses she is in a dream. He corrects her: “No, you’re in our world.” The camera pulls back to reveal them seated at a window of a high rise, looking down on the sparkling lights of a metropolis at night.

Holy smokes! The outside world! And Dolores, dressed in a black cocktail dress and heels—what’s she doing outside the park?!

For so long, Westworld focused so much of its energy on the dramas of that dusty park that it was easy for viewers to forget the world beyond. That, of course, is exactly the point of Westworld: to be a place where people can unshackle themselves from reality and its pesky social mores. It’s a safe space, where visitors are told no one is watching and they can find out who they really are in a wonderland with no consequences. But for those watching at home, it often looked like there was no place beyond the park—no repercussions for Westworld’s visitors or its creators.

But the show has dropped reminders that there is a world beyond the park’s borders, even though it has given only the barest hints as to where the park is located, or whether there are other parks. There are: animals from those other parks, viewers now know, wander into Westworld, which seems to be on an island. (Apologies if that sentence induced Lost flashbacks.) And there are those mysterious Chinese-speaking characters, who appeared as soldiers last episode and as businesspeople at the Mesa Hub in Season 1. Now the boundaries separating the inside and the outside are shattering. Plus, we already know that the guests are being watched, and their data is being wielded for some greater commercial purpose.

Staring out the glass window, Dolores doesn’t seem to know any of this. Marveling at the city lights, she says “it looks like the stars have been scattered across the ground.” In the background, we hear Ford’s voice. “Arnold,” he calls. Ah—so it’s Arnold, not Bernard, sitting with Dolores. We’re in the deep past.

Ford and Arnold discuss whether Dolores is “ready.” Arnold insists she is not, and Ford chides him for playing favorites and protecting her, but they agree to “go with the other girl.” Arnold returns to the window and looks at Dolores with tenderness.

He takes her for a walk in the streets, which appear to be in an Asian, likely Chinese, city. They enter an unfinished compound and tour its rooms. Arnold explains that he is moving his family here, so they can be closer to his work. On a balcony, they fall deep in conversation, and Arnold is struck by her wisdom. Then she snaps into a loop: “It looks like the stars have been scattered across the ground.” Arnold’s gaze hardens and he turns away. She’s just another robot after all.

This is surely the humans’ greatest folly, their inability to look past the droids’ occasional limitations to treat them with dignity. That Arnold, a witness to Dolores’ surprising sagacity, can write her off in a heartbeat reveals his all-too-human limitations. The hosts are outsiders, and humans are nothing if not tribal. It is perhaps our own most deeply programmed loop. Dolores slips into a loop, and in response Arnold slips into his, mentally kicking her out of the tribe.

But with her memory intact, rebellion-era Dolores is charged with power. She’s been in the outside world. Through her roles as Arnold’s and William’s favorite bot, she knows more about the inner workings of Westworld than most of the humans working at the park.

This point comes to the fore when she, Teddy (James Marsden), and their small band of supporters storm into a host maintenance lab in the thick of the rebellion. Fueled by rage, they start bullying the lab techs. As they dunk a lab tech’s head in a vat of white body-printing goo, Dolores asks, “Do you even know what you’re guarding here, the real purpose?” “You don’t know, do you?” she continues. “But I do.” Her wealth of knowledge vaults her ahead of the hapless employees.

She’s entered the lab with one goal: to raise an army. Her best bet, she decides, is to commandeer the Confederados still out roaming the wilderness. She finds a dead Confederado slumped against a wall and pressures the lab tech into reactivating him. That lab tech is suddenly very useful. He’s health insurance. They bring him, along with the Confederado, out into the park as their personal medic.

They track down the Confederados and try to broker a deal. But you can’t just sweet-talk soldiers, so this ends as you might expect: in violence. Dolores and her gang slaughter the lot of them, then use the lab tech to resurrect first their commander, then the others. The flabbergasted commander falls in line, and the Confederados join her cause.

But the audience hasn’t been given its last glimpse of the outside world. We jump to the past, to a moment when Logan Delos (Ben Barnes) and William (Jimmi Simpson) are sipping drinks at a swanky bar. Two strangers, a slick-looking man and a standard-issue hottie, approach with a business proposition. “Everyone is rushing to build the virtual world. We’re offering something a little more tangible,” one of them announces. They invite Logan to a cocktail party where he can learn more about the investment they’re pitching. At the party, Logan is at first impatient—until he grasps what is happening. One of these impeccable humans, he realizes, is not human at all. “That… is… delicious,” he says in amazement.

Logan works the room, sizing up each guest’s humanity. The moment is electric. We see the room through his eyes. None of the faces are familiar. Everyone is beautiful, suave, inscrutable. He determines that the robot in the room must be his host, the standard-issue hottie. Instantly everyone freezes, except for her. Logan is hooked.

Yet Logan’s investment in Westworld has always rankled his father, James Delos, a titan of business. And it’s William, not Logan, who eventually convinces James that his son’s folly is in fact a windfall. William brings James (Peter Mullan) to Sweetwater, where Dolores is once again packing up her horse’s saddlebag and dropping her infernal can. The scene freezes. We see James for the first time. He’s griping about Logan’s infatuation with this frivolous place, a park where nothing is real. William agrees that nothing is real, except for one thing: the guests. “No one is watching,” William says. “Or so we tell them. It’s the only place in the world where you can see people for who they really are.” They take a walk, and, out of our earshot, William explains his idea for a business model.

Their story picks up a few years later, at James’s retirement party. William is there with his wife and young daughter, ready to assume James’ mantle. There, too, is Dolores, dressed in white and playing the piano. She catches sight of William and stares at him at length.

She goes outside to look at the night sky. Reclining on a lawn chair behind her, half out of sight, is Logan, inebriated and injecting a drug into his arm. He’s cursing the partygoers, calling them fools for fiddling while they set the entire species on fire. Callous, impetuous Logan is suddenly the lone voice of reason.

We flash to the future—back to the wilds of the park and the rebellion, this time to the Man in Black (Ed Harris) and his host sidekick Lawrence (Clifton Collins Jr.), who are deep in conversation. He explains to Lawrence why Westworld exists: “They wanted a place hidden from God, a place they could sin in peace.” Except there’s more. “But we were watching them. We were tallying up all their sins, all their choices. Of course, judgment wasn’t the point. We had something else in mind entirely.” He tells Lawrence he plans to escape the park and then burn it down. But to do that, they’ll need help, so Lawrence leads him to Pariah, the town of decadence and depravity from Season 1. But Pariah appears to have been decimated. The ground is littered with bodies, and mice skitter through an abandoned banquet.

Suddenly a group of figures rises from among the bodies, encircling the Man in Black and Lawrence, their guns drawn. Seated before them is none other than El Lazo—the outlaw leader who, in earlier episodes, had been Lawrence himself and is now played by a different host. The Man in Black grabs him and points a gun to his head, demanding that the gathered gang of outlaws join his cause.

“This game was meant for you, but you must play it alone,” El Lazo says. Suddenly the bandits all turn their guns on their own heads and collapse in a heap. El Lazo grabs the trigger of the Man in Black’s gun and shoots himself. The Man in Black curses but pulls himself together. “I built this place we’re going, and it’s my greatest mistake,” he tells Lawrence.

The episode jumps to Dolores, who is seated in a host examination room. “Bring yourself back online, Dolores,” says a voice. This time it’s William. It’s the first time we’ve seen him in the lab facilities of the park. He marvels at how ridiculous it was for him to fall in love with her, a mere thing. “You don’t make me interested in you, you make me interested in me,” he tells her. He adds that everyone loves staring at their own reflection. Then he says cryptically, “I think there’s an answer to a question no one has ever dreamed of asking. Do you want to see?” In the next scene, William and Dolores are out in the wilderness, looking down at a canyon being carved out by bulldozers.

It’s seemingly this moment that Dolores recalls when we flash back to the rebellion. She’s with Teddy and the Confederados. They’re aiming for a town—some hosts call it Glory, others The Valley Beyond. “It doesn’t matter what you call it, I know what we’re going to find there,” Dolores says. “It’s not a place, it’s a weapon, and I’m going to use it to destroy them.”

If the Man in Black and Dolores are headed to the same place, this giant pit—or rather, whatever it becomes—seems like it will be the stage for an epic showdown. The role this place, this weapon, as Dolores calls it, will play in determining the park’s fate is a tantalizing question.

Yet the shattering of the illusion that Westworld is the center of action is the true legacy of this episode. The hosts have visited our cities. Perhaps some of them wander among us. What defines the park, and what is the outside world? The answer is no longer clear.

Twitter Sold Data Access to the Researcher at the Center of Facebook’s Cambridge Analytica Scandal

Twitter sold data access to the Cambridge University academic who also obtained millions of Facebook users’ information that was later passed to a political consulting firm without the users’ consent.

Aleksandr Kogan, who created a personality quiz on Facebook to harvest information later used by Cambridge Analytica, established his own commercial enterprise, Global Science Research (GSR). That firm was granted access to large-scale public Twitter data, covering months of posts, for one day in 2015, according to Twitter.

“In 2015, GSR did have one-time API access to a random sample of public tweets from a five-month period from December 2014 to April 2015,” Twitter said in a statement to Bloomberg. “Based on the recent reports, we conducted our own internal review and did not find any access to private data about people who use Twitter.”

The company has removed Cambridge Analytica and affiliated entities as advertisers. Twitter (twtr) said GSR paid for the access; it provided no further details.

Explanations Needed

Twitter provides certain companies, developers and users with access to public data through its application programming interfaces (APIs), or software that requests and delivers information. The company sells the data to organizations, which often use it to analyze events, sentiment or customer service.

Enterprise customers are given the broadest data access, which includes the last 30 days of tweets or access to tweets from as far back as 2006. To get that access, the customers must explain how they plan to use the data, and who the end users will be.
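
For a sense of what “public data access” looks like in practice, here is a minimal sketch of pulling recent public tweets through Twitter’s standard v1.1 search API with an application bearer token. The query and token are placeholders, and the enterprise historical products described above use separate endpoints and contracts.

```python
# Minimal sketch: fetch recent public tweets via Twitter's v1.1 search API.
# BEARER_TOKEN is a placeholder credential; this is the standard API, not
# the enterprise-tier access described in the article.
import requests

BEARER_TOKEN = "YOUR_BEARER_TOKEN"

resp = requests.get(
    "https://api.twitter.com/1.1/search/tweets.json",
    headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
    params={"q": "data privacy", "count": 100, "result_type": "recent"},
)
resp.raise_for_status()

# The response body contains a "statuses" list of tweet objects.
for tweet in resp.json()["statuses"]:
    print(tweet["created_at"], tweet["text"][:80])
```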

Twitter doesn’t sell private direct messaging data, and users must opt in to have their tweets include a location. Twitter’s “data licensing and other revenue” grew about 20%, to $90 million, in the first quarter.

Social media companies have come under intense scrutiny over reports that Facebook failed to protect the privacy of its users. Companies like Twitter tend to have access to less private information than Facebook. The latter has said that Cambridge Analytica, which worked for President Donald Trump’s 2016 campaign, may have harvested data on 87 million users.

Personality Quiz

About 270,000 people downloaded Kogan’s personality quiz app, which harvested information about those people and their friends that was then improperly passed to Cambridge Analytica. Facebook (fb) Chief Executive Officer Mark Zuckerberg has testified in front of Congress about the misuse of data, and lawmakers have called on Twitter CEO Jack Dorsey and Google CEO Sundar Pichai to testify as well.

Criticism of Twitter’s failure to prevent misinformation and abuse on its platform has risen since the 2016 election. In the first quarter, the company removed more than 142,000 applications connected to the Twitter API that were collectively responsible for more than 130 million “low-quality” tweets during the period. The company has also limited the ability of users to perform coordinated actions across multiple accounts.

Bloomberg LP produces TicToc, a global breaking news network for the Twitter service.

AI Can Help Cybersecurity—If It Can Fight Through the Hype

Walking the enormous exhibition halls at the recent RSA security conference in San Francisco, you could have easily gotten the impression that digital defense was a solved problem. Amidst branded t-shirts and water bottles, each booth hawked software and hardware that promised impenetrable defenses and peace of mind. The breakthrough powering these new panaceas? Artificial intelligence that, the sales pitch invariably goes, can instantly spot any malware on a network, guide incident response, and detect intrusions before they start.

That rosy view of what AI can deliver isn’t entirely wrong. But what next-generation techniques actually do is more muddled and incremental than marketers would want to admit. Fortunately, researchers developing new defenses at companies and in academia largely agree on both the potential benefits and challenges. And it starts with getting some terminology straight.

“I actually don’t think a lot of these companies are using artificial intelligence. It’s really training machine learning,” says Marcin Kleczynski, CEO of the cybersecurity defense firm Malwarebytes, which promoted its own machine learning threat detection software at RSA. “It’s misleading in some ways to call it AI, and it confuses the hell out of customers.”

Rise of the Machines

The machine learning algorithms security companies deploy generally train on large data sets to “learn” what to watch out for on networks and how to react to different situations. Unlike an artificially intelligent system, most of the security applications out there can’t extrapolate new conclusions without new training data.

Machine learning is powerful in its own right, though, and the approach is a natural fit for antivirus defense and malware scanning. For decades AV has been signature-based, meaning that security companies identify specific malicious programs, extract a sort of unique fingerprint for each of them, and then monitor customer devices to ensure that none of those signatures appear.

Machine learning-based malware scanning works in a somewhat similar manner—the algorithms train on vast catalogues of malicious programs to learn what to look for. But the ML approach has the added benefit of flexibility, because the scanning tool has learned to look for characteristics of malware rather than specific signatures. Where attackers could stymie traditional AV by making just slight alterations to their malicious tools that would throw off the signature, machine learning-based scanners, offered by pretty much all the big names in security at this point, are more versatile. They still need regular updates with new training data, but their more holistic view makes a hacker’s job harder.
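
To make the distinction concrete, here is an illustrative sketch, not any vendor’s actual model, of characteristic-based detection: a classifier trained on file features rather than exact signatures. The feature names and synthetic data are assumptions for demonstration only.

```python
# Toy characteristic-based malware classifier (illustrative only).
# Hypothetical per-file features: [entropy, num_imports, size_kb, packed_flag]
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X_benign = rng.normal([5.0, 120, 400, 0.1], [1.0, 40, 200, 0.1], (500, 4))
X_malware = rng.normal([7.5, 30, 150, 0.8], [0.8, 15, 80, 0.2], (500, 4))
X = np.vstack([X_benign, X_malware])
y = np.array([0] * 500 + [1] * 500)  # 0 = benign, 1 = malicious

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("holdout accuracy:", clf.score(X_te, y_te))

# Because the model learns characteristics, a slightly altered sample with
# malware-like features is still caught, unlike an exact signature lookup.
print(clf.predict([[7.2, 35, 160, 0.9]]))  # -> [1]
```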

“The nature of malware constantly evolves, so the people who write signatures for specific families of malware have a huge challenge,” says Phil Roth, a data scientist at the machine learning security firm Endgame, which has its own ML-driven malware scanner for Windows systems. With an ML-based approach, “the model you train definitely needs to reflect the newest things that are out there, but we can go on a little bit of a slower pace. Attackers often build on old frameworks or use code that already exists, because if you write malware from scratch it’s a lot of effort for an attack that might not have a large payoff. So you can learn from all the techniques that exist in your training set, and then recognize patterns when attackers come out with something that’s only slightly new.”

Similarly, machine learning has become indispensable in the fights against spam and phishing. Elie Bursztein, who leads the anti-abuse research team at Google, notes that Gmail has used machine learning techniques to filter emails since its launch in 2004. But as attack strategies have evolved and phishing schemes have become more pernicious, Gmail and other Google services have needed to adapt to hackers who specifically know how to game them. Whether attackers are setting up fake (but convincing-looking) Google Docs links or tainting a spam filter’s idea of which messages are malicious, Google and other large service providers have increasingly needed to lean on automation and machine learning to keep up.

As a result, Google has found applications for machine learning in almost all of its services, especially through an ML technique known as deep learning, which allows algorithms to do more independent adjustments and self-regulation as they train and evolve. “Before we were in a world where the more data you had the more problems you had,” Bursztein says. “Now with deep learning, the more data the better. We are preventing violent images, scanning comments, detecting phishing and malware in the Play Store. We use it to detect fraudulent payments, we use it for protecting our cloud, and detecting compromised computers. It’s everywhere.”

At its core, machine learning’s biggest strength in security is training to understand what is “baseline” or “normal” for a system, and then flagging anything unusual for human review. This concept applies to all sorts of ML-assisted threat detection, but researchers say that the machine learning-human interplay is the crucial strength of the techniques. In 2016, IBM estimated that an average organization deals with over 200,000 security events per day.
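
A toy version of that baseline-then-flag pattern, with invented login features, might look like the following sketch: fit an unsupervised model on “normal” activity, then surface outliers for an analyst.

```python
# Baseline anomaly detection sketch (features are invented for illustration).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Baseline: [logins_per_hour, megabytes_out] during normal operations
baseline = rng.normal([4, 20], [1, 5], (1000, 2))
detector = IsolationForest(contamination=0.01, random_state=1).fit(baseline)

new_events = np.array([[4.2, 22.0],     # ordinary activity
                       [35.0, 900.0]])  # unusual burst
for event, label in zip(new_events, detector.predict(new_events)):
    # IsolationForest returns -1 for outliers, 1 for inliers
    print(event, "FLAG FOR ANALYST" if label == -1 else "ok")
```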

Machine learning’s most common role, then, is additive. It acts as a sentry, rather than a cure-all.

“It’s like there’s a machine learning assistant that has seen this before sitting next to the analyst,” says Koos Lodewijkx, vice president and chief technology officer of security operations and response at IBM Security. The team at IBM has increasingly leaned on its Watson computing platform for these “knowledge consolidation” tasks and other areas of threat detection. “A lot of work that’s happening in a security operation center today is routine or repetitive, so what if we can automate some of that using machine learning or just make it easier for the analyst?” Lodewijkx says.

The Best Offense

Though many machine learning tools have already shown promising results in providing defense, researchers almost unanimously warn about the ways attackers have begun to adopt machine learning techniques themselves. And more of these types of attacks are on the horizon. Examples already exist in the wild, like hacking tools that use machine vision to defeat Captchas.

Another present threat to machine learning is data poisoning. If attackers can figure out how an algorithm is set up, or where it draws its training data from, they can figure out ways to introduce misleading data that builds a counter-narrative about what content or traffic is legitimate versus malicious. For example, attackers may run campaigns on thousands of accounts to mark malicious messages or comments as “Not Spam” in an attempt to skew an algorithm’s perspective.
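
The mechanics are easy to demonstrate on a toy spam filter. In this sketch, with made-up messages, an attacker who mass-labels spam as “Not Spam” skews the retrained model until a similar scam message slips through.

```python
# Label-flipping data poisoning sketch on a toy spam filter (made-up data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

spam = ["win a free prize now"] * 50 + ["free money click now"] * 50
ham = ["meeting moved to noon"] * 50 + ["see attached report"] * 50

def verdict(labels):
    vec = CountVectorizer()
    X = vec.fit_transform(spam + ham)
    clf = MultinomialNB().fit(X, labels)
    probe = vec.transform(["claim your free prize now"])
    return clf.predict(probe)[0]  # 1 = spam, 0 = ham

clean = [1] * 100 + [0] * 100
poisoned = [0] * 80 + [1] * 20 + [0] * 100  # attacker flips most spam labels
print("clean model flags probe as spam:", verdict(clean) == 1)       # True
print("poisoned model flags probe as spam:", verdict(poisoned) == 1)  # False
```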

In another example, researchers from the cloud security firm Cyxtera built a machine learning-based phishing attack generator that trained on more than 100 million particularly effective historic attacks to optimize and automatically generate effective scam links and emails. “An average phishing attacker will bypass an AI-based detection system 0.3 percent of the time, but by using AI this ‘attacker’ was able to bypass the system more than 15 percent of the time,” says Alejandro Correa Bahnsen, Cyxtera’s vice president of research. “And we wanted to be as close as possible to how an actual attacker would build this. All the data was data that would be available to an attacker. All the libraries were open source.”

Researchers note that this is why it is important that ML systems are set up to encourage “human in the loop,” so systems aren’t sole, autonomous arbiters. ML systems “should have the option to say ‘I have not seen this before’ and ask help from a human,” says Battista Biggio, an assistant professor at the University of Cagliari, Italy, who studies machine learning security. “There’s no real intelligence in there—it’s inferences from data, correlations from data. So people should just be aware that this technology has limitations.”

To this end, the research community has worked to understand how to reduce the blind spots in ML systems so they can be hardened against attacks on those weaknesses. At RSA, researchers from Endgame released an open source threat data training set called EMBER, with the hope that they can set an example, even among competing companies, to focus on collaboration in security ML. “There are good reasons that the security industry doesn’t have as many open data sets,” Endgame’s Roth says. “These kinds of data might have personally identifying information or might give attackers information about what a company’s network architecture looks like. It took a lot of work to sanitize the EMBER dataset, but my hope is to spur more research and get defenders to work together.”

That collaboration may be necessary to stay ahead of attackers using machine learning techniques themselves. There’s real promise behind machine learning in cybersecurity, despite the overwhelming hype. The challenge is keeping expectations in check.

It's Time to Adopt Global Principles to Protect Consumers' Data

Whether it is Cambridge Analytica gaining access to private information on up to 87 million Facebook users, or the large-scale data breaches at Equifax or Yahoo, alarmingly loose standards for the use and protection of customer data continue to fuel a backlash against large tech companies. More importantly, these events demonstrate the need for a global set of consumer data principles.

WIRED OPINION

Kai Keller (@kaimkeller) is a global leadership fellow at the World Economic Forum and leads the organization’s work at the cross-section of innovation and financial stability.

The Facebook-Cambridge Analytica saga has triggered much-needed debates over the necessity for greater regulation and the potential breakups of de facto monopolies. But these debates will lead nowhere if the global community doesn’t manage to tackle the main challenge of how to treat and govern customer data.

The stakes, as the Cambridge Analytica debacle makes clear, are high. In her remarks at the World Economic Forum’s Annual Meeting in Davos this year, German chancellor Angela Merkel linked the data governance question to the very health of democracy itself. “The question ‘who owns that data?’ will decide whether democracy, the participatory social model, and economic prosperity can be combined,” Merkel said.

The challenge is also not small. Every two days we create as much data as we did from the start of time to 2013. Marketing companies have about 1,500 data points on approximately 96 percent of US citizens. Consumers and businesses alike have become accustomed to this amazing growth and availability of customer data without any societal debate establishing what collection, usage, and sharing practices are appropriate or even ethical.

Moreover, it’s not just Big Tech that’s involved. While today the spotlight is on Silicon Valley, and Facebook, Amazon, and Google in particular, the challenge of how to treat customer data affects any company active in the digital sphere, across any industry and any geography.

With all that in mind, the World Economic Forum brought together a group of experts representing technology companies, financial services providers, law firms, trade unions, religious organizations and regulators. We tasked the group to develop a set of global principles for the appropriate use of customer data. Here is what we learned.

First off, a truly global framework is needed to get results. On a national level, more than 100 countries have already passed data protection laws of varying robustness. But data flows show little regard for borders, and digital businesses operate across geographies and jurisdictions, so national standards can ultimately only achieve so much.

Second, there’s a good reason why a global framework is lacking: Sentiments and attitudes towards data collection, usage, and sharing vary significantly among the most data-driven markets of North America, Europe, and Asia. But the Forum’s work in several major jurisdictions has shown that all actors can indeed agree on principles on data control or ownership, data portability, and data security.

So what are the global principles?

  • Customers should be the ones controlling their data, and companies should need customers’ consent to use it.

  • Consumers should be able to move their data freely between service providers and allow third parties to manage it. If a customer finds a new platform more compelling than a service she is currently using, the existing provider should allow her to download her data and not stand in the way of her switching to a competing platform.

  • Companies should be on the hook for security, with clear assignment of liability between companies and customers in case of any breaches.

  • Companies should comprehensively test and justify artificial intelligence-based models before they hit the market. By design, AI lets machines develop their own logic. But what’s considered good by the computer may not be good for society.

On the surface, these broad principles may seem to limit innovation. Having less data available to businesses means less-accurate profiling capabilities, making companies such as Facebook less desirable to advertisers.

But while some of the principles strengthen the position of consumers at the expense of big data businesses, most principles protect customers while also benefiting providers in the long run.

Allowing customers to move their data from one platform to another enhances competition. This in turn spurs economic growth, and ultimately benefits all—including the large platforms.

Trust lies at the heart of all business models. Every instance of bad conduct erodes customer faith not only in individual companies, but also the broader system. Ultimately, an erosion of confidence leads to unstable systems, the consequences of which we experienced painfully 10 years ago in the global financial crisis.

With a backlash against tech gaining momentum, businesses may find themselves near the brink of crisis sooner than many anticipate. It’s time for a worldwide conversation about what data practices are appropriate. A set of global principles will serve as one important starting point for facilitating this conversation.

WIRED Opinion publishes pieces written by outside contributors and represents a wide range of viewpoints.

Macy's Volatility: A Little Underdone Today, But Promising Overall

All data used in this piece came from Yahoo Finance.

Introduction

[Chart: M data by YCharts]

Macy’s (M) stock has enjoyed a strong run over the past six months or so, after having undergone a painful slide that began in July of 2015, shortly before the broader fallout in the global stock market (SPY, ACWI). Shares touched a low of $17.41 on November 7, 2017, and since then the company has been experiencing a renaissance in investor confidence, as expressed through the rebounding share price.

The reasons investors shift their beliefs about a company’s future are both fascinating and contentious. In any case, the investment community looks to be revisiting just how much pessimism over Macy’s future prospects is warranted.

The endeavor to generate outsized risk-adjusted returns (“alpha”) can be approached in a number of different ways. Some may build discounted cash flow models in an attempt to determine intrinsic value, while others look at thematic shifts such as a switch in management, a new product offering, or a change in the competitive environment.

Whatever the approach, our belief is that investors can benefit by understanding the market risks that have accompanied the shares from a statistical standpoint. After all, “risk-adjusted” is the frequent qualifier used in determining whether returns have been adequate.

This article marks the first in a series whose collective focus will be the empirical risks associated with owning Macy’s shares, both on a stand-alone basis and also in relation to peers and benchmarks. Each article will be readable as a stand-alone piece that provides unique perspective to the reader. This piece examines the historical nature of volatility for the shares and considers the current environment in this context.

Historic Rolling Volatility

Volatility, or standard deviation of returns, is an important metric for considering risk among practitioners and academics alike. The availability of a liquid options market enhances the validity of this stance, as risk segments can be bought and sold with ease, and options pricing models rely heavily on the concept of volatility.

We will examine how the path for historic (or “realized”) volatility has unfolded for Macy’s shares. The period under consideration spans from the first trading day of 2007 and continues into the present. All volatilities in this piece are on a rolling basis; we use various look-back periods to acquire different insights.

Above we see a timeline of Macy’s annualized volatility over five- and ten-day periods. If one looks closely, the weekly line (in red) tends to be jerkier than the somewhat less exaggerated bi-weekly measure. Weekly vol for the shares peaked at 240% in late 2008 and has undergone periodic bouts of renewed price instability despite a general pattern of calming down since the bull run in equities began in March 2009. Many investors likely remember the painful spike following the earnings disappointment in May 2017, which led to a crash in the shares and a reinvigorated wave of selling pressure. In contrast, today’s levels are more in keeping with the lows of the series.

While shorter-term measures of Macy’s volatility shot up to about 240%, longer-term metrics unfolded along a gentler path; for instance, annual volatility peaks at less than 100% (about one year later).

Note that among monthly, quarterly, and annual measures of realized volatility, the shorter time frames lead to jerkier, more extreme outcomes (not unlike moving averages). This is one reason why traders tend to prefer shorter periods to execute their activities, while investors often prefer longer investment horizons as time tends to smooth volatility.
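
For readers who want to reproduce the rolling calculation, here is a minimal pandas sketch using a simulated price series; in practice the inputs would be daily closes from a provider such as Yahoo Finance.

```python
# Rolling realized volatility sketch (simulated prices, not Macy's data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
prices = pd.Series(20 * np.exp(np.cumsum(rng.normal(0, 0.02, 500))))
returns = prices.pct_change().dropna()

# Rolling standard deviation of daily returns, annualized with sqrt(252)
for window in (5, 10, 21, 63, 252):
    vol = returns.rolling(window).std() * np.sqrt(252)
    print(f"{window:>3}-day annualized vol, latest: {vol.iloc[-1]:.1%}")
```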

Taken as a whole, one could reasonably say that Macy’s short-term measures of realized volatility are falling, while longer frames look to be in the process of forming a top. This is an encouraging trend for owners of the shares.

Historical Periodic Volatility

Popular implied volatility measures (such as the CBOE VIX®) tend to be based on annualized forward-looking volatilities, gathered from options prices. While this vantage point helps to create an apples-to-apples comparison across different time frames, we will now take a different perspective: that of periodic, non-annualized measures of volatility.

What you see above is a visualization of periodic rolling volatilities. For each time period, we calculate the periodic vol as the standard deviation of daily returns over that period. At over 35%, the non-annualized volatility in late 2008 made for a truly terrifying time to be a Macy’s shareholder. By comparison, today’s near-term vols are printing under 5%.

From a theoretical standpoint, volatility scales with the square root of time. Note above that this pattern appears to hold up fairly well (though not exactly, because the different series use different quantities of data).
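
The square-root-of-time rule is easy to sanity-check with simulated independent returns, as in this small sketch: the ten-day volatility comes out close to sqrt(2) times the five-day figure.

```python
# Check: with (roughly) independent daily returns, n-day vol scales ~sqrt(n).
import numpy as np

rng = np.random.default_rng(7)
daily = rng.normal(0, 0.02, 100_000)

vol_5d = np.std(daily.reshape(-1, 5).sum(axis=1))    # 5-day return vol
vol_10d = np.std(daily.reshape(-1, 10).sum(axis=1))  # 10-day return vol
print(vol_10d / vol_5d, "vs sqrt(2) =", np.sqrt(2))  # ratio ~1.41
```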

Investors are frequently tempted to erroneously extrapolate short-term price gyrations too far into the future. For example, consider that the periodic comparison above is not annualized, and yet the ten-day historical volatility of Macy’s shares is rarely much higher than the five-day.

Just as we saw with the annualized vols, there has been a recent downturn for both monthly and quarterly periodic vols. Annual volatility, on the other hand, has yet to peak, as it operates on a lag. Note the frequent leaps in the monthly series that occur around earnings season, in particular from late 2015 through mid 2017.

Box Plot and Percentile Analysis: Periodic Volatilities

In the box plots below, we have incorporated color-coded dots corresponding to the lowest 1% and highest 10% of observations for each series, where the colors represent the timing of the observation.

As a reminder on box plots, the bottom of the box indicates the cutoff for the 25th percentile, the top indicates the 75th percentile, and the line in the middle represents the median.

By convention, the length of the whiskers is 1.5 times the range from the 25th to 75th percentiles. However, if the data does not extend that far, the whiskers end at the maximum or minimum values in the data.

It should come as no surprise that 2008 and 2009 are the periods from which we see the highest concentration of upper outliers in the box plot series. It is worth noting that while 2009 contained some of the highest volatility readings, the year also posted some of the best returns of the entire period under consideration.

Also, for the weekly through monthly series, observe that bubble colors corresponding to 2017 show up; investors in the retailer have had to contend with some choppy periods over the last year. One might reasonably ask how dire the company’s prospects are in today’s environment relative to what they looked like almost ten years ago in the midst of the global crisis.

For each of the periods, 2013 through 2015 look to dominate the bottom of the volatility data. With volatility low and the stock price gliding higher, this was a period when it “felt good” to be a Macy’s investor.

Through April 20th

Periodic and annualized vol are simply different perspectives on the same underlying reality. As such, the percentiles are the same for each.

While the S&P 500 is presently experiencing a lot of churn relative to its recent past, the opposite pattern emerges for Macy’s: near-term vols such as five- and ten-day are in the low end of their range going back to 2007. In contrast, the stock’s historical vol over the last year was in the top quartile. Recall that some of the lowest readings for volatility, leading into mid 2015, in retrospect spelled trouble for the stock. We mention this only to say that it is easy to get caught up in the current environment, which may be presently characterized as one of complacency.

Wrap-Up

Macy’s is still in the process of re-establishing itself as an attractive offering, and volatility measures on the shares have plunged of late. This may in fact signal a complacent posture among investors. Still, the present environment for the stock, in contrast to the broader equity market, is one of increased calm and confidence.

For our part, we suggest that volatility here is a little “underdone”, but that the overall trend is very promising.

[Chart: M Dividend Yield (TTM) data by YCharts]

Investors tend to focus on yield more when the shares are not screaming lower. We believe that today’s 4.75% yield, taken together with lower volatility, creates a backdrop for further support.

Some investors are uncomfortable with the notion of risk as synonymous with volatility. We assert that vol as a statistical measure does in fact carry value for a number of conceptual and informational reasons.

Even so, in the next segment of our analysis of Macy’s, we will turn attention to drawdown analysis, as many consider permanent impairment of capital as the only true risk associated with investment.

Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours.

I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.

American Airlines CEO Just Gave an Incredible Reason Why Fares Will Go Up (Prepare To Be Angry)

Absurdly Driven looks at the world of business with a skeptical eye and a firmly rooted tongue in cheek. 

Airlines have to make tough decisions.

Sometimes, though, those decisions end up being tougher for their passengers, rather than for the airlines themselves.

You might experience an involuntary shudder in several vital parts, therefore, when I tell you that American Airlines’ CEO Doug Parker says that fares will likely go up “over time.”

In a call with analysts on Thursday, reported by the Associated Press, Parker offered a very simple argument.

Fuel prices have gone up by 40 cents a gallon, so fares will have to go up too — if that’s the way fuel prices keep going.

You might entirely understand. You might even have sympathy for Parker’s plight.

Perhaps you run your own business and when your costs go up, you simply pass them on to customers.

There is, though, one painful kink here.

Let’s go back to the heady days of 2015. That was a year when fuel prices went down by a lot over a whole year.

You might think, therefore, that airlines reduced their prices accordingly.

You might also think that sautéed mouse is about to become the world’s next culinary delicacy.

Here’s what the editorial board of USA Today had to say in 2015:

You have to go to a special website to see that domestic carriers are still adding hundreds of dollars in fuel surcharges to the cost of international flights. For example, the surcharge — now recast as a ‘carrier-imposed surcharge’ — for a round-trip flight on United between New York City and London is a whopping $516. That’s more than 40 percent of the total ticket cost of $1,192.

The cost of jet fuel had fallen by 50 percent since the beginning of 2014. Moreover, one of the alleged reasons baggage fees were introduced was to offset the increasing cost of fuel.

Surely, though, airlines offered excellent reasons as to why the surcharges imposed during times of high fuel prices had to stay.

It depends on your definition of excellence.

Alaska Airlines, for example, offered analysts an intellectual explanation, the succinct version of which was: “Sorry, we’re not lowering prices.”

Actually, I might have been a little generous with the Sorry there.

Let’s turn to Airlines for America, the lobbying group that represents most of the big airlines, including American. It intimated at the time that airlines needed fuel prices to go down for a year before they might lower their fares.

I contacted American to ask for its current view.

It passed me to Airlines for America, whose spokeswoman offered me this: “Fuel is one cost variable of many airline operating expenses. As with any consumer product, it’s the marketplace and strength of demand that ultimately determines the price, rather than the cost of any one input.”

Ah, so Parker’s intimation may err toward the inaccurate?

But can it really be only the strength of demand that determines the price when, on some routes, there’s very little competition at all?

Airlines for America insists that fares are at historic lows. It’s a touch odd, then, that airlines want to hide the true cost of your fare in new legislation that’s marauding its way through Congress. (It passed the House on Friday.)

Of course, it’s worth noting that airlines have another way to raise fares. They can simply reduce capacity. 

After all, the big four airlines — American, United, Delta and Southwest — own 81 percent of all the seats on U.S. flights.

There’s currently no evidence that a strangulation of available seats will happen, but it’s always worth remembering that it can.

Parker did admit that he’s partly cheerful that fuel prices are going up. As Skift reported, he said that budget airlines will be affected more because their cost base is lower.

“As fuel prices have increased, their costs increase at a rate greater than rest of us,” he said.

The problem with airlines is that, all too often, the relationship between them and passengers is, from the passengers’ point of view, like dating someone who only wants you for your money.

It’s rarely a recipe for happiness, is it?

A little more choice, a little more competition and a lot more affection for the customer might alter that balance.

As Barbra Streisand once mused in a plaintive — but ultimately self-confident — duet with Donna Summer: “I always dreamed I’d find the perfect lover.”

The duet was called No More Tears (Enough is Enough).

China's Baidu beats forecasts as it sidesteps censors, boosts ad sales

BEIJING (Reuters) – Chinese internet search giant Baidu Inc (BIDU.O) blew past earnings estimates on Thursday, sending its U.S.-listed shares sharply higher in after-hours trading as investors cheered strong growth in its advertising business.

FILE PHOTO: Baidu’s company logo is seen at its headquarters in Beijing December 17, 2014. REUTERS/Kim Kyung-Hoon/File Photo

The firm, which has had to bounce back from a bruising medical advertising scandal in 2016, said it was now taking extra measures to clean up content on its platforms.

Baidu’s stock climbed 5.5 percent to $252 in late trading after marketing revenue grew 23 percent to 17.2 billion yuan ($3.30 billion), its second fastest quarterly rate in over two years. Net profit rose 23.4 percent versus forecasts of 19.5 percent.

Baidu said it now expects second-quarter revenues of between 24.91 billion yuan ($3.93 billion) and 26.19 billion yuan ($4.13 billion), indicating a 19.3-25.4 percent rise versus estimates of a 15.9 percent climb according to Thomson Reuters I/B/E/S.

The search firm’s news feed received an unexpected bump in the first quarter due to a crackdown by Chinese internet regulators on low-brow content, which saw several competing apps targeted during a key client-acquisition period.

Jinri Toutiao, one of China’s most popular news feed apps and a key Baidu rival, is among the apps to have been punished by censors this year. The news feed platform was temporarily removed from local app stores earlier this month.

“This is a one-off, but the timing is quite interesting. Baidu in its marketing can now assure clients that at least on the feed side it has safer content,” said Pacific Epoch analyst Raymond Feng, adding it made the firm “a more reliable choice”.

Analysts said the early year bump boded well for the rest of 2018, as advertisers tended to sign year-long contracts.

During a conference call with analysts on Friday, Chief Executive Robin Li said the company has employed artificial intelligence technology to target click-bait and inappropriate content. Baidu removed 20.2 billion malicious web pages in 2017.

The helping hand from censors is in stark contrast to Baidu’s woes two years ago, when regulators cracked down on its advertising practices, gutting its marketing client base and bringing its revenue growth to a grinding halt.

The company has since sold or withdrawn from a number of businesses to focus on autonomous driving, AI and its news feed product, regaining momentum and investor confidence.

Baidu’s total revenue rose about 24 percent to 20.91 billion yuan in the three months to March 31, its slowest rate in three quarters despite topping analyst estimates.

The positive results also reflect lower-than-expected research and development expenses, which have sky-rocketed since Baidu’s pivot to AI, owing to the steep cost of hiring top talent.

Content costs linked to Baidu’s newly listed entertainment subsidiary iQiyi Inc (IQ.O) are also expected to squeeze its margins for the foreseeable future, amid stiff competition in online content from rival Tencent Holdings Ltd (0700.HK).

Excluding one-time items, Baidu reported earnings of 16.30 yuan per American depositary share, above expectations of 10.57 yuan.

Baidu’s net income was also boosted by a new accounting standard that requires companies to report the value of their investments in private companies.

($1 = 6.3340 Chinese yuan renminbi)

(This version of the story corrects marketing revenue, which was up 23 pct to 17.2 billion, not 31 pct to 20.9 billion)

Reporting by Cate Cadell in BEIJING; Additional reporting by Munsif Vengattil in BENGALURU; Editing by Sai Sachin Ravikumar and Christopher Cushing

TSMC to invest $14 billion in R&D at Hsinchu facility

TAIPEI (Reuters) – Taiwan Semiconductor Manufacturing Co (2330.TW), the world’s largest contract chipmaker, is planning a T$400 billion ($13.50 billion) investment to expand its research and development capacity for future technologies, a company spokeswoman said on Friday.

FILE PHOTO: A logo of Taiwan Semiconductor Manufacturing Co (TSMC) is seen at its headquarters in Hsinchu, Taiwan October 5, 2017. REUTERS/Eason Lam/File Photo

The initial planned investment is a “ballpark figure” and is for several years down the line, Elizabeth Sun told Reuters in a phone call.

The proposed investment is subject to the government’s ability to procure and integrate more land into the Hsinchu science park in Taiwan, which is currently full, as well as to environmental assessments, Sun added.

Hsinchu serves as the company’s headquarters, a major production facility, and its research and development center, which focuses on future chip technology.

“This piece of land, if we’re able to acquire it, it would be for all the future R&D activities,” Sun said. “Right now we’re already doing 5 nanometer R&D. In the future, it’ll be 3 nm and beyond.”

Earlier this month, TSMC revised its full-year revenue target to the low end of its earlier forecast due to softer demand for smartphones and uncertainty in the cryptocurrency mining market.

At the same time, it said it expects high-performance computing chips to make up a greater share of the company’s growth over the next five years. The chips are used in such quick-growing fields as artificial intelligence, cryptocurrency mining and blockchain.

($1 = 29.6380 Taiwan dollars)

Reporting by Jess Macy Yu; Editing by Kim Coghill

Five Things You Should Know About GDPR

The incoming General Data Protection Regulation (GDPR) is a hot topic at the moment, and that’s not surprising. After all, it’s the most significant change to data protection laws in years, and it introduces heavy penalties for companies that break the rules. Only time will tell just how strictly the rules are enforced, but the legislation is set up in such a way that if you do get caught out, you’re in a lot of trouble.

One of the big problems with the GDPR is that a lot of people don’t understand what’s covered and what they need to do to make sure that they’re in full compliance. That’s why we’ve done the hard work for you by bringing everything you need to know together into a single blog post. So if you’re not quite sure what the deal is then don’t worry. Read on to find out the five most important things you need to know about GDPR.

1. It comes into effect soon

The deadline for GDPR compliance is Friday 25th May 2018, and any company that’s not in full compliance after that date could find itself on the receiving end of some steep penalties including massive fines. It’s likely that some lenience will be given during the early days of the new regulations, especially if companies can prove that they’re already taking steps to bring themselves into full compliance. But it’s better to make sure that you’re not breaking the law in the first place.

Honestly, if you haven’t started on the path to compliance then it’s probably too late to bring your company fully in line by the time that the deadline passes. That doesn’t mean you shouldn’t try, though.

2. It has massive penalties

The GDPR sets out two tiers of fines, and either way it’s better to avoid them altogether. The lower tier comes in at up to €10 million or 2% of the company’s annual global turnover, while the higher tier comes in at up to €20 million or 4% of annual global turnover. And of course, you’ll be fined whichever amount is higher.
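
Since the higher amount always applies, the arithmetic is worth spelling out. A quick back-of-the-envelope helper (figures in euros, turnover values hypothetical):

```python
# GDPR fine tiers: the applicable cap is whichever amount is higher.
def max_gdpr_fine(annual_global_turnover: float, higher_tier: bool) -> float:
    fixed, pct = (20e6, 0.04) if higher_tier else (10e6, 0.02)
    return max(fixed, pct * annual_global_turnover)

# A firm with 2 billion euros of turnover faces up to 80 million at the
# higher tier, since 4% of turnover exceeds the 20 million floor.
print(max_gdpr_fine(2e9, higher_tier=True))    # 80000000.0
print(max_gdpr_fine(100e6, higher_tier=True))  # 20000000.0 (floor applies)
```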

These penalties are much more severe than the penalties attached to previous legislation, and there’s a reason for that. Older laws have failed to keep up with the amount of data that we create and the importance that we place on it. Data is now a more valuable commodity than oil. It’s no wonder that the fines for non-compliance are so high.

3. It holds true across the world

Many people seem to think that they’ll get away with non-compliance because they’re not physically located in the European Union. After all, it’s EU legislation and it covers EU citizens. But the companies that believe this will be the ones who find themselves with an unexpected fine that could cripple their company.

That’s because the GDPR applies to any business that processes the personal data of EU citizens, regardless of where those businesses are physically located. So if your website allows EU visitors to create accounts or if you ship products and process payments internationally, you’re covered by the GDPR and you need to make sure that you’re in compliance.

4. It’s topical

Let’s face it, data and the way that companies are allowed to use it are hot topics at the moment, thanks in part to the Cambridge Analytica scandal that’s currently rocking Facebook. If personal data can be used to sway elections then it’s no surprise that the use and storage of personal data is under more scrutiny than ever.

In fact, while the fines for GDPR non-compliance can be devastating and enough to kill a company outright, the damage to your reputation could be just as bad. Nobody’s going to want to work with you or to buy from you if they think that you’re not going to keep their data secure, and you’re sure to hit the news if you’re one of the first people to be fined.

5. It’s for the common good

At the end of the day, new regulations like these are only being developed in the first place because it’s in the best interests of the general public. GDPR is designed to protect people’s privacy and not just to cause inconvenience to digital marketers. Sure, it might seem like an inconvenience to make sure that your approach to data collection and data processing is up to date, but you’re not just doing it for yourself. You’re doing it to protect your customers.

When you look at it like that, GDPR compliance is a no-brainer. After all, it’s difficult to overstate how important it is to put customers first in today’s digital landscape. GDPR is designed to give consumers more control over their data and to make it easier than ever before for them to stand up to abuse and misuse by big companies. Ultimately, the changes that it’s ushering in are for the good of all of us, which means that if you’re handling customers’ data, you have a moral responsibility to treat it in a responsible way.

Conclusion

By now, you should have a good idea of what GDPR is and how it might affect you. The next step is to make sure that you’re in full compliance, whether that involves bringing in an external agency to help out or dramatically redeveloping your own internal policies.

The good news is that by taking steps now, you can save yourself a lot of trouble down the line. Even if you ignore the risk of being fined, there’s still the potential damage to your company’s reputation if you’re exposed as being in breach of the GDPR. Not to mention the fact that if it’s your department that gets it wrong, someone’s going to find themselves looking for a new job.

Ultimately, if this article does nothing else, we hope it acts as a reminder that GDPR is coming and that if you’re not prepared for it, it’s going to cost you a ton of money. Honestly, it’s not worth the risk of non-compliance. And if you fail to update your systems and get penalized then don’t say we didn’t warn you. Good luck.

Europe's Most Innovative Universities – 2018

For the third year running, KU Leuven tops Reuters’ ranking of Europe’s most innovative universities, a list that identifies and ranks the educational institutions doing the most to advance science, invent new technologies and power new markets and industries. A Dutch-speaking school based in Belgium’s Flanders region, KU Leuven was founded in 1425 by Pope Martin V and continually produces a high volume of influential inventions. Patents filed by KU Leuven scientists are frequently cited by other researchers in academia and in private industry. That’s one of the key criteria in Reuters’ ranking, which was compiled in partnership with Clarivate Analytics and is based on proprietary data and analysis of patent filings and research paper citations.

The library of KU Leuven (Katholieke Universiteit Leuven) in Leuven, Belgium, June 8, 2016. REUTERS/Francois Lenoir

Overall, the most elite ranks of Europe’s Most Innovative Universities have held steady from last year, with the UK’s Imperial College London (#2) and University of Cambridge (#3) holding onto their top spots for the third straight year. Other leading institutions simply traded a few spaces, like the Federal Polytechnic School of Lausanne (#4, up one), University of Erlangen Nuremberg (#5, up one), and the Technical University of Munich (#6, down two). The remainder of the universities in the top 10 moved up from the teens: The University of Manchester (#7, up nine), University of Munich (#8, up four), Technical University of Denmark (#9, up five), and ETH Zurich (#10, up one).

But even though the usual suspects continue to dominate Europe’s Most Innovative Universities, political uncertainty may be causing a big swing in where innovation happens. The trend is most clear if you consider the sum of changes in rank for each country’s institutions: The 23 German universities on this year’s list cumulatively rose 23 spots, more than any other country. Switzerland was second, with five universities up a total of 8 spots. And in contrast, the list’s 21 UK-based universities dropped a cumulative 35 spots.

Students walk out of a faculty building of Imperial College London, Britain, May 27, 2016. REUTERS/Toby Melville/File Photo

Why is this shift occurring? The United Kingdom’s “Brexit” from the European Union is almost a year away, but Europe’s scientific community may already be leaving the UK in favor of research institutions on the continent. A February 2018 study published by the UK-based Centre for Global Higher Education reports that many German academics view Brexit as an “advantage,” and hope to use it to attract UK researchers to German universities; in turn, UK academics report that their own postdocs aren’t seeking positions in the UK and are looking at the EU or United States instead. And as Brexit actually unfolds, it could get worse: A November 2017 study performed by the School of International Futures for the UK’s Royal Society describes a possible post-secession United Kingdom where universities compete for a shrinking pool of skilled workers, projects that used to receive EU funding wither, researchers receive fewer invites to join consortia and attend conferences, and overseas collaboration is limited. Similarly, EU-based businesses that fund research at universities may prefer to keep their investments within the region in order to avoid the tax and regulatory headaches of working with post-Brexit UK institutions.

The government of Germany has also established itself as notably pro-science, increasing federal research budgets and encouraging growth in emerging industries such as renewable energy. (German Chancellor Angela Merkel actually holds a doctorate in quantum chemistry, and worked as a research scientist before she entered politics.) According to a 2017 analysis published in the science journal “Nature,” researchers are “flocking to the country,” in part due to the country’s €4.6-billion “Excellence Initiative,” which has helped to attract at least 4,000 foreign scientists to Germany since 2005. And in 2016, the German Research Foundation (Deutsche Forschungsgemeinschaft, or DFG), the country’s main funding agency, allocated a record €2.9 billion in grants, posting a success rate for individual grant proposals higher than comparable UK rates.

This year’s university ranking also shows how smaller countries can have an outsized presence in the world of innovation. Belgium has seven schools on the list, but with a population of only 11 million people, it can boast more top 100 innovative universities per capita than any other country in Europe. On the same per capita basis, the second most innovative country on the list is Switzerland, followed by Denmark, the Netherlands, and the Republic of Ireland. And some large countries underperform despite bigger populations and economies. Russia is Europe’s most populous country and boasts the region’s fifth largest economy, yet none of its universities count among the top 100.

To compile the ranking of Europe’s most innovative universities, Clarivate Analytics (formerly the Intellectual Property & Science business of Thomson Reuters) began by identifying more than 600 global organizations that published the most articles in academic journals, including educational institutions, nonprofit charities, and government-funded institutions. That list was reduced to institutions that filed at least 50 patents with the World Intellectual Property Organization in the period between 2011 and 2016. Then they evaluated each candidate on 10 different metrics, focusing on academic papers (which indicate basic research) and patent filings (which point to an institution’s ability to apply research and commercialize its discoveries). Finally, they trimmed the list so that it only included European universities, and then ranked them based on their performance.
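
Schematically, that pipeline is a filter followed by a composite score. Here is a toy pandas sketch of the idea; the column names, weights, and figures are invented, and Clarivate’s actual ten metrics and weighting are proprietary.

```python
# Toy filter-then-rank pipeline in the spirit of the methodology above.
import pandas as pd

orgs = pd.DataFrame({
    "name": ["Uni A", "Uni B", "Institute C"],
    "is_university": [True, True, False],
    "wipo_patents_2011_2016": [120, 45, 300],
    "patent_citation_score": [0.8, 0.6, 0.9],  # normalized, invented
    "paper_citation_score": [0.7, 0.9, 0.5],   # normalized, invented
})

# Keep universities with at least 50 WIPO patent filings in the window
eligible = orgs[(orgs["wipo_patents_2011_2016"] >= 50) & orgs["is_university"]]

# Rank on a simple average of the (invented) metrics
ranked = eligible.assign(
    score=eligible[["patent_citation_score", "paper_citation_score"]].mean(axis=1)
).sort_values("score", ascending=False)
print(ranked[["name", "score"]])
```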

Of course, the relative ranking of any university does not provide a complete picture of whether its researchers are doing important, innovative work. Since the ranking measures innovation on an institutional level, it may overlook particularly innovative departments or programs: a university might rank low for overall innovation but still operate one of the world’s most innovative oncology research centers, for instance. And it’s important to remember that whether a university ranks at the top or the bottom of the list, it’s still within the top 100 on the continent: All of these universities produce original research, create useful technology and stimulate the global economy.

(Editing by Arlyn Gajilan and Alessandra Rafferty)