Richard Prince (detail)

Regulation reactions, unintended inferences

Consumers and Innovators Win on a Level Playing Field [spotify /] It’s why, after careful consideration, Spotify has filed a complaint against Apple with the European Commission (EC), the regulatory body responsible for keeping competition fair and nondiscriminatory. In recent years, Apple has introduced rules to the App Store that purposely limit choice and stifle innovation at the expense of the user experience—essentially acting as both a player and referee to deliberately disadvantage other app developers. After trying unsuccessfully to resolve the issues directly with Apple, we’re now requesting that the EC take action to ensure fair competition.

Regulation reactions: Where Warren’s Wrong [stratechery /2] I do know what is the first thing Senator Warren should do: rectify three clear areas where I believe she is mistaken about technology. Her proposal is wrong about tech’s history, the source of the tech giants’ power, and the fundamental nature of technology itself. All three are, unsurprisingly, interrelated, and it is impossible to craft a cogent antitrust policy without getting all of them right.

Benedict’s Newsletter [ /2] But what does that mean? Is Amazon not allowed to sell on its own behalf at all and still have Marketplace – so it has to kill half of the business? Or does this only apply to private-label products? Does that mean Walmart and every other retailer have to shut down private-label products as well (invented c.150 years ago)? Then, Apple isn’t allowed to both have an app store and have apps in the app store… so does that mean when you turn on your phone there are no apps and you get a ‘choose which App Store to use’ screen? Does Apple have to shut down Final Cut Pro (now sold on the Mac App Store)? Or what?

In the UK: We need tougher scrutiny of Big Tech’s data use and deals [FT /9] The UK should create a digital markets unit, which could sit within the Competition and Markets Authority or a sector regulator, to supervise companies deemed to have “strategic market status”. The unit would enforce a code of conduct as well as open, shared standards. We must make it easier for people to move their personal data from one digital platform to another and improve general access to non-personal or anonymised data.

Arrow of Time and its Reversal on IBM Quantum Computer [pdf] Here we show that, while in nature the complex conjugation needed for time reversal is exponentially improbable, one can design a quantum algorithm that includes complex conjugation and thus reverses a given quantum state. Using this algorithm on an IBM quantum computer enables us to experimentally demonstrate a backward time dynamics for an electron scattered on a two-level impurity. (They reversed the direction of time!! -ed)
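The conjugation trick can be seen in a toy numerical sketch: for a real-valued Hamiltonian, complex-conjugating the evolved state lets the same forward evolution run the dynamics backward. This is a made-up two-level example in NumPy (hypothetical Hamiltonian and timestep), not the paper’s actual quantum circuit:

```python
import numpy as np
from scipy.linalg import expm

# Toy two-level system with a real, symmetric Hamiltonian (hypothetical values).
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])
t = 0.7
U = expm(-1j * H * t)  # forward time-evolution operator

psi0 = np.array([0.6, 0.8j])  # some normalized initial state
psi_t = U @ psi0              # evolve forward in time

# "Time reversal" via complex conjugation: since H is real, U @ conj(psi_t)
# = exp(-iHt) exp(+iHt) conj(psi0) = conj(psi0); a final conjugation
# recovers the original state exactly.
psi_back = np.conj(U @ np.conj(psi_t))

assert np.allclose(psi_back, psi0)
```

The point of the paper is that nature never supplies this conjugation for free; the authors build it explicitly into a quantum circuit.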

A New Privacy Constitution for Facebook [BruceSchneier/medium/1] What follows is a list of changes we should expect if Facebook is serious about changing its business model and improving user privacy. (a long and strong list of changes needed by someone with a clue -ed)

Unintended inferences: The biggest threat to data privacy and cybersecurity [techrepublic /19] What is unintended inference? In the research paper A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI, co-authors Sandra Wachter and Brent Mittelstadt of the Oxford Internet Institute at University of Oxford describe how the concept of unintended inference applies in the digital world. The researchers write that artificial intelligence (AI) and big data analytics are able to draw non-intuitive and unverifiable predictions (inferences) about behaviors and preferences: “These inferences draw on highly diverse and feature-rich data of unpredictable value, and create new opportunities for discriminatory, biased, and invasive decision-making. Concerns about algorithmic accountability are often actually concerns about the way in which these technologies draw privacy invasive and non-verifiable inferences about us that we cannot predict, understand, or refute.”

A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI [ssrn /5] In this paper we argue that a new data protection right, the ‘right to reasonable inferences’, is needed to help close the accountability gap currently posed by ‘high risk inferences’, meaning inferences that are privacy invasive or reputation damaging and have low verifiability in the sense of being predictive or opinion-based. In cases where algorithms draw ‘high risk inferences’ about individuals, this right would require ex-ante justification to be given by the data controller to establish whether an inference is reasonable. This disclosure would address (1) why certain data is a relevant basis to draw inferences; (2) why these inferences are relevant for the chosen processing purpose or type of automated decision; and (3) whether the data and methods used to draw the inferences are accurate and statistically reliable.

Report #4: Building an angle detector for journalism [mondaynote /2] Our goal is to measure the semantic distance between stories within the context of an event, to detect the most original and the deepest story. In doing so, we want to spotlight the key differentiator, which is the angle chosen by a journalist or an editor.

Workplace Monitoring and Surveillance [pdf] Technologies are also enabling employers to expand the granularity, scale, and tempo of data collection. Data collected about workers are often fed into systems to inform automated decision-making, to make predictions about workers’ future behaviors, their skills or qualities, as well as their promotion or continued employment. As Adler-Bell and Miller point out, “data-mining techniques innovated in the consumer realm have moved into the workplace.” This can alter the power dynamics between workers and employers, as data-driven decision-making can make management more opaque and difficult to interrogate or challenge. Predictive analytics and flagging tools meant to identify rule-breaking can augment biased and discriminatory practices in workplace evaluations and segment workforces into risk categories based on patterns of behavior — such as identifying which employees are most likely to leave their jobs. While these tools are touted as bringing greater insight into workforces through a growing array of metrics, workers and others are challenging the power imbalances they generate, as well as their accuracy and fairness on a technical level.

Under the hood: Portal’s Smart Camera [facebook /2] The filmmakers that we worked with shared a range of insights, some of which were well-established techniques — such as how experts tend to compose shots and how those decisions influence audience expectations — while others were more instinctual and harder to replicate with AI. For one experiment, we asked a group of professional camera operators to film a series of scenes where it was difficult to capture the action from a single angle. Analyzing the results revealed that while there’s no consistent ground truth for how a seasoned pro films a given situation (camera operators often make different decisions despite sharing the same angle and subjects), there are subtle movements that filmmakers instinctively use to produce a more natural, intuitive camera experience. We carefully analyzed these movements and distilled them into software models that aim to mimic this experience in Smart Camera. These proved more effective than movements guided by a simple mathematical strategy.
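For contrast, the kind of “simple mathematical strategy” a virtual camera might use is just easing the frame toward the subject with exponential smoothing. A hypothetical one-dimensional sketch (names and the `alpha` parameter are invented; Portal’s actual models are learned from the filmmaker data described above):

```python
def smooth_pan(subject_positions, alpha=0.15):
    """Ease the camera toward the subject with exponential smoothing.

    alpha controls responsiveness: low values give slow, gentle pans;
    high values track the subject tightly but feel robotic.
    """
    camera = subject_positions[0]
    path = [camera]
    for target in subject_positions[1:]:
        camera += alpha * (target - camera)  # close a fixed fraction of the gap
        path.append(camera)
    return path

# A subject that jumps from x=0 to x=10: the camera drifts over rather than
# snapping, which reads as a deliberate pan instead of a jump cut.
path = smooth_pan([0.0] * 3 + [10.0] * 20)
```

The appeal of such formulas is predictability; the article’s point is that they fall short of the subtle, situational movements professional operators make.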

Being Queen’s Roadie was One Intense, Rewarding Job [medium /1] Few people could approach Fred as he prepared for a show, but I would saunter over to him, while he was surrounded by ‘beautiful and important’ people, and ask, ‘Oi! What do you fancy playing tonight then, Fred?’