>A national privacy law would require that companies disclose how they collect your information, use your information, and offer a right-to-be-forgotten, Benioff explained. “If you want to delete your information, you could hit that button and be sure your data is gone forever.”
Okay, want to start by adding that button to the Salesforce DMP page? Right now, it looks like the best I can do is get an opt-out cookie that expires in 6 months. I'll wait.
Edit: I was being facetious. I know exactly why he said this and I don't expect Salesforce to do this until they legally have to. I just hope the irony wasn't lost on anyone.
No, it's not, because SF won't be allowed to do it either. It's like a power company advocating for the banning of coal while still using it because it's the cheapest option. You can participate in a practice to stay competitive while advocating for the banning of that practice.
This is unquestionably true, but it is an awful excuse not to enact any regulations. Honestly, what do people expect the answer to be if the free market has already failed to address this problem and we are ruling out regulation because it increases the barrier to entry?
"regulations could avoid security breaches anyway"
Translucent Databases 2nd Ed: Confusion, Misdirection, Randomness, Sharing, Authentication And Steganography To Defend Privacy http://a.co/c78Gij0
TL;DR: All demographic records are stored encrypted and are no longer retrievable if you lose the signing key. Think "proper password storage" extended to all things.
Bonus: Support for GDPR "right to be forgotten" for free. Just erase the key(s).
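The "just erase the keys" idea is sometimes called crypto-shredding, and it fits in a few lines. This is an illustrative toy (class and method names are mine, not the book's), using a one-time pad so it stays dependency-free; a real system would use a proper cipher and a separate key store:

```python
import secrets

class TranslucentStore:
    """Toy crypto-shredding store: each record is encrypted under a
    per-user key, so erasing the key makes the record unrecoverable."""

    def __init__(self):
        self.keys = {}     # user_id -> key (in practice, kept in a separate KMS)
        self.records = {}  # user_id -> ciphertext

    def put(self, user_id, data):
        # One-time pad for illustration; a real system would use AES-GCM.
        key = secrets.token_bytes(len(data))
        self.keys[user_id] = key
        self.records[user_id] = bytes(a ^ b for a, b in zip(data, key))

    def get(self, user_id):
        key, ct = self.keys[user_id], self.records[user_id]
        return bytes(a ^ b for a, b in zip(ct, key))

    def forget(self, user_id):
        # "Right to be forgotten": delete only the key. The leftover
        # ciphertext is indistinguishable from random noise.
        del self.keys[user_id]

store = TranslucentStore()
store.put("alice", b"dob=1990-01-01")
assert store.get("alice") == b"dob=1990-01-01"
store.forget("alice")  # record remains, but is now permanently unreadable
```

The nice property is that deletion is O(key size) no matter how many backups and replicas hold the ciphertext.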
Edit: To be clear, I'm not saying anything about the necessity of any regulations. I'm just saying that when evaluating possible future regulations, viewing just in the light of incumbent/newcomer dynamics alone will give you absurd results.
Dow, having established cash flow and infrastructure, can trivially bear whatever costs these impose. These regulations were even created because Dow themselves (et al.), somewhere in the middle of their corporate life, optimized their profit by dumping waste in rivers.
You personally know the environmental harm and grave illegality of improperly disposing of waste products, and your company is small enough that you can be sure everybody is of similar mind - you're focused on solving technical problems, having not yet been taken over by beancount-maximizers. But you still must pay the costs of the overbearing "compliance" paperwork designed around large amoral entities, perhaps even having to hire a dedicated government bureaucrat fresh out of law school. This is the gatekeeping-legislation dynamic people complain about.
Which, given your corporate size and lack of TBTF status, would be a criminal penalty rather than a civil wrist slap.
Any business leader who has studied the tobacco industry would do this.
This becomes even more true as machine learning becomes more and more important. The first movers have a strict advantage over later entrants simply because of the amount of data.
As it is, the large players like Facebook, Google, Salesforce, and a bunch of others like our banks, credit card companies, and internet providers are all extremely well established.
A GDPR-like law would open up entirely new ways of doing business, partly because these companies could no longer do business the way they do now. “Free” services could no longer make their money by selling your data.
We could end up with a more “honest” set of services. Like a social network that’s actually free or paid for by individual contributions.
You're completely disregarding the fact that it takes significant resources to comply with a law like GDPR. Insurance to defray the costs of potential litigation and fines, development time, ongoing compliance (audits etc.), legal expenses, and on and on. Most startups simply can't afford this.
So you may wind up with more "honest" services, but there will be far fewer of them and they'll have more power and leverage than they ever have before because of the lack of competitors and artificial, enormous regulatory barriers to entry. GDPR is a startup killer.
"Insurance to defray the costs of potential litigation and fines, development time, ongoing compliance...GDPR is a startup killer"
That doesn't have to be the case. The law could easily carve-out a "safe harbor" of sorts for companies that commit to not gathering or storing any but the most basic information about their visitors.
"... there will be far fewer of them and they'll have more power..."
Or existing companies could lose a lot of power and leverage because their business model doesn't make sense. New ones could pop up in their place.
I'm more than willing to give up tyrants like Facebook and Equifax for the right to control my own private data.
Privacy is good, long live disruption!
You’re missing the point. These “tyrants” can fully afford to comply (while paying scores of legal staff to scour the law to find and exploit every possible loophole), but startups can’t. That means no competitors will be able to emerge and challenge them. These companies will still be able to do much of whatever they want simply because consumers won’t have a choice. GDPR consolidates market power in the hands of entrenched competitors that can afford to comply.
Privacy is good, long live disruption!
Privacy is good. Killing the ability for startups to compete is bad.
I mean, it’s not like Facebook and Google aren’t already big enough to smother would-be competition in the cradle. In the other scenario competition may still be smothered, but I still get some regulatory privacy protections.
EDIT: Also, as a matter of principle, I’d gladly see dozens of startups burn if it meant broad privacy protection were enacted.
GDPR compliance and "trampling your privacy" are not remotely related. GDPR is massive overkill and unnecessarily burdensome.
I’d gladly see dozens of startups burn if it meant broad privacy protection were enacted.
EU startups will burn - not dozens though, hundreds or thousands of them - and even more will never get the funding to start because no one wants to invest in a business that can be killed instantly by massive fines at the whim of the government.

US startups will thrive because they are not subject to GDPR if they don't target EU customers, even if there is some incidental EU traffic to their sites. I don't have to protect your information GDPR style on my US site, even if you are from Germany, as long as I'm not actively trying to get people from the EU to my site. But most sites outside the EU will just block EU traffic anyway (which is what we decided to do).

So enjoy your new, smaller Internet with companies that will "trample your privacy" anyway because you have no competitors to go to for their services. Yes, you will be informed about what they're doing in vague terms, and yes you will have given them "informed consent"....but is it really consent if you have to give it because there are no alternatives?
Again, most “new innovative” companies can’t afford to comply with laws like GDPR.
Disruption is startup paradise.
On top of that, the technical controls were actually expensive to implement. The issue is not that consent is hard to revoke or that data is hard to delete. It's that everything needs to be auditable. Under a strict reading, you need to have an audit trail of every processing activity. Any edit to a user profile, for any reason (even an automated removal of whitespace), is supposed to be auditable and notifiable to the user. Anytime you refresh a user's Instagram stats, you need to have an audit of that. Most companies don't go this far, but it's a huge risk to do anything less because we don't know what fines are going to look like in practice yet.
You’re saying this based on what exactly? Have you actually been in charge of trying to comply? I have. For us, it was going to be a 7 figure endeavor (upfront, and then 6 figures/yr for ongoing compliance, insurance, etc.) and we are pretty small (a few million visitors/mo spread across several sites). We made the decision to just block EU traffic, even though we have been advised that GDPR probably doesn’t apply to us anyway. I can’t have one of the 28 countries that GDPR applies to randomly decide that it applies to me because they’re a little short on tax revenue that month, then have an expensive legal fight that will end in a multimillion-dollar fine anyway.
Those were just on the first page of Google. There are hundreds of others. That's not even counting my own experience with it.
For example, Facebook is going to force everyone to opt-in to accept the status quo while remaining in GDPR compliance:
Good luck doing that if you are a startup!
- data you give it (birthday, religion, likes)
- data it learns about you. (ad tracking, habits, demographics)
Facebook built a developer tool that let app developers ASK USERS for "data you gave it."
Users clicked YES when prompted with the question, GIVING the app developers the data. Cambridge Analytica NEVER got to touch "data Facebook learned about you."
The way the developer tools were designed, when you gave the app access to your Facebook account, it could look at things YOUR FRIENDS had made available to YOU. The app could act as you and see what you see. You can never see "data Facebook learned about your friends."
The issue is, people couldn't trust their friends not to click "SHARE DATA." The issue is WHETHER it's inappropriate for apps I install to see data shared with me by my friends. Once Facebook learned this ability was being misused, they shut it down, back in 2014.
Core Salesforce is a company data store often requiring human data input. It’s where your information goes when you fill out a “Contact us” form on somebody’s website.
If the table stakes for getting started become GDPR-style compliance infrastructure that takes 3 engineering man-years to implement properly, then fewer new companies will be founded, which means less future competition for the incumbents.
It's using the legal system as a tool for their competitive advantage -- force everyone to do the thing you're already doing.
> Are those things we want other companies to be doing, though?
Apparently if someone got successful doing something bad once, we're supposed to be OK with people doing that bad thing forever, lest we risk some "regulatory capture" boogeyman.
The solution to regulatory capture is an actual solution to regulatory capture, not a general aversion to regulation.
Few things terrify incumbents more than a fast moving nimble and smart team of innovators. e.g. FB paying $19bn for WhatsApp.
And I give less than zero shits about any startup whose business model necessitates violating my privacy--if privacy laws create barriers to entry, that's absolutely fine. The rights to privacy that some new laws could give us is simply more important than that.
A legal regime that does the former without doing the latter? Sign me up. One that does both? Well, I guess I do still get privacy laws, but are they worth a damn if the latter is among their effects?
The GDPR is very expensive. Every data processing activity needs to be auditable, every customer relationship needs a data processing agreement (and each new custom one needs to be reviewed by legal), we now pay fees to all sorts of data protection authorities. On top of that, it forced us into a new, more expensive insurance policy.
What fees? In the UK you have to register with the authority (as UK companies did before), and that fee has a max of £2,900 per year for companies with a turnover greater than £36 million or more than 250 members of staff.
Orgs under that limit pay £35 to £60 depending on size. Nonprofits pay zero, companies who only process data for things like staff admin pay zero, and there are other exclusions.
If you're engaging in unethical behavior that has a market, and you want to stop the behavior, withdrawing from the market definitely doesn't stop it. It encourages a new player to step up and have an advantage over you, and worse, repeat that unethical behavior.
'This is absolutely hilarious. This dude has a huge herd and makes a fortune selling sheep that were fattened from grazing the commons.'
 Yes, I know that the story that the "tragedy of the commons" parable is based on is fake, and they actually had sensible customs to prevent the free-for-all; this is just for illustrative purposes.
This is unfortunate. As more and more big tech companies do this, the tech industry is turning more and more into the bad guy, and something I didn't want to be involved in.
I don't know, maybe he's being authentic and is seeing a problem, but the way the industry has been headed just makes me not trust it.
This is true of most established industries like Pharmaceuticals, Chemical, Nuclear, Mining, and Automotive.
I get where you're coming from, and Salesforce does its share of this, but they aren't Google or Facebook, deriving most of their profit from user data.
Their app is used mainly to drive and automate business processes in a system that can serve the entire organization. A lot of businesses run a very high percentage of their operations (the run-the-business part) on Salesforce, including some of the biggest companies in the world.
They make their money from subscriptions, the AppExchange, and services.
I'm sure they collect data (and some of their marketing/lead generation software I'm sure collects data), but it might be a little much to say that the entire product is built around collecting user data.
On top of that, Salesforce does a TON to portray themselves as an extremely liberal company - from diversity initiatives and minority outreach to widely marketing their charitable contributions. So at least on the surface they are consistent with their messaging.
Not a shill - just sharing what I know from companies where I've worked that used salesforce in one way or another.
Right, and as I've told others, you either don't know what Salesforce DMP is or you don't know what a DMP is. To say Salesforce's only offering is the CRM aspect of their product is just wrong.
I’ve done my share of implementations, and Marketing Cloud is just one of their product groupings; it isn’t even the one that is used the most. In fact, most organizations (esp. big ones) usually have different tools for the marketing piece.
They offer other data services, but again even in orgs that have purchased them I haven’t seen them used that much.
There are so many other parts of Salesforce that are extensively used outside of marketing, that I’m not sure how their crm is not the main thing.
I even said in my post that they definitely do the data collecting etc., just that it’s not their primary source of revenue like FB et al.
Marketing Cloud is Salesforce's email marketing platform, formerly ExactTarget before being acquired.
My joke was that I'd love to see Benioff put an "export my data and forget about me" button on Salesforce DMP (formerly Krux)'s site.
> They offer other data services, but again even in orgs that have purchased them I haven’t seen them used that much.
And on that note, let's wrap this up.
They make their money primarily off of subscriptions; not sure why that statement is so controversial.
And that last point you quoted was specifically made to illustrate that their data services aren’t even that popular. Companies that buy it often don’t even use it.
Sounds like you have a fairly easy way to remain opted out. Either that or using the never ending amount of adblockers to block trackers specifically?
"Change the law so I don't have to do scummy things," is actually fairly rational.
That's a common misconception. It's not true.
I'm no lawyer, but the way I read that decision is: do what you think best, just don't purposefully mine the field against one of your shareholders.
The practical question is not necessarily whether Craigslist has a duty to maximize shareholder value, but rather, to what extent can shareholders use the courts to second-guess the business decisions of corporate management? As the links in the Google search above attest, decades of precedent seems to suggest: almost no extent at all.
To look at another high-profile example, Apple CEO Tim Cook--in a shareholders meeting!--said "When we work on making our devices accessible by the blind, I don't consider the bloody ROI. If you want me to do things only for ROI reasons, you should get out of this stock." While it made some news, I don't recall a shareholder lawsuit forcing Tim out of Apple for refusing to maximize shareholder value.
We can have a great, and factually-based discussion on whether Newmark, et al violated their contractual duty, under the terms of that specific contract; extrapolating from that to "but fiduciary duuuuty!" (for these narrowly-construed notions of "fiduciary duty") is unsupported by the facts, or the case law.
Dodge v Ford is also, afaik, generally considered a weak precedent for the notion — which was specifically articulated by the state supreme court, and therefore doesn't bind peer state courts, or any Federal courts. ("Persuasive precedent" != "binding precedent".)
Several states' courts have also rejected the same argument in later cases. "The general legal position today is that the business judgment that directors may exercise is expansive. Management decisions will not be challenged where one can point to any rational link to benefiting the corporation as a whole." 
"[Newmark and Buckmaster] did prove that they personally believe craigslist should not be about the business of stockholder wealth maximization, now or in the future. As an abstract matter, there is nothing inappropriate about an organization seeking to aid local, national, and global communities by providing a website for online classifieds that is largely devoid of monetized elements. Indeed, I personally appreciate and admire [Newmark's and Buckmaster's] desire to be of service to communities. The corporate form in which craigslist operates, however, is not an appropriate vehicle for purely philanthropic ends, at least not when there are other stockholders interested in realizing a return on their investment. Jim and Craig opted to form craigslist, Inc. as a for-profit Delaware corporation and voluntarily accepted millions of dollars from eBay as part of a transaction whereby eBay became a stockholder. Having chosen a for-profit corporate form, the craigslist directors are bound by the fiduciary duties and standards that accompany that form. Those standards include acting to promote the value of the corporation for the benefit of its stockholders."
"Throughout this dispute, I have repeatedly read and listened to what look and sound like breach of contract arguments, which eBay uses not to prove Jim and Craig breached a contract, but rather to prove Jim and Craig breached their fiduciary duties. This has been an odd exercise, and I admit I am puzzled by eBay’s decision not to bring a breach of contract claim or, more promising perhaps, a claim for breach of the implied covenant, considering eBay expended significant effort arguing that the 2008 Board Actions violated both the technical provisions and the spirit of the SPA and the Shareholders’ Agreement. The fact remains, however, that eBay asserted neither a breach of contract claim nor a claim for breach of the implied covenant."
The presiding judge in this very case — whom an article I read on it described as, "one of the most influential corporate jurists in the country" — doesn't think eBay's theory of the case is the right one. It's the only one they brought (to the exclusion of the arguments he thought more apt), though, so he can't rule on them.
That is: eBay structured their minority shareholder agreement with Newmark, et al, and then their case over breach of that agreement, to make it look like a fiduciary duty claim, without offering the court a more accurate alternative.
It's, IMO, a bit specious to take something that was structured specifically so as to be construed in a way that is at odds with the reality of the situation — as specifically cited by the most relevant authority possible, in context — as evidence of the conclusion they (eBay) want you to reach.
EDIT: Though not entirely apt, it's a bit like a prosecutor bringing only a murder charge against a defendant, and not offering the jury the choice to convict instead on a "lesser included offense" like manslaughter, or negligent homicide, or something. Yes, there's clearly a tort in play here, but eBay's insistence that it's this specific tort doesn't make it so.
The section you quoted makes clear that whether or not it was a breach of contract, it certainly was a breach of fiduciary duty because those duties were the sole basis of the complaint.
From what I've read, if eBay had not bought that stake under those specific terms, there wouldn't have been a case here. (Or at least not this case.)
Again, IANAL, but I think that makes it more of a contractual duty case. A dog wearing a bill and wings, with a collar that quacks, is apparently a duck — if your lawyer is good enough…
There's a lot of space between incorporating as a for-profit corporation and then not seriously pursuing profits and the trope that if some action is not maximizing profits right now, you're not allowed to do it. For example, corporations do routinely make charitable contributions.
If Benioff starts talking about the importance of privacy and can make an argument that it aligns well with the ethical standards of the company, then he's probably safe.
He doesn't have to always do the thing that's most profitable as long as he isn't straying outside of the guidelines that have been set for investors.
The issue that seems to be at stake with eBay vs. Newmark is that a company seems to be prevented from leaving a significant amount of possible profit on the table in order to provide direct amorphous distributed good - in craigslist's case the common person not having to be subject to (graphical) psychological manipulation when viewing classified ads.
A contrasting example is of Bill Gates's modern philanthropy efforts. It's certainly great that he's using that Microsoft lucre to make the world a better place. But IMHO it would have made the world an even better place if Microsoft had relaxed their business practices and not put us all through hell in the 90s.
The first approach leaves wealth and self-determination at the edges, while the latter suffers from the standard problems of top-down redistribution.
Practically, the root of the problem in the craigslist case is that they incorporated as for-profit and simply had an informal goal to begin with. A single stakeholder then sold their share to a purely profit-interested entity, which used that minority share to push craigslist down their own desired path (in addition to siphoning trade secrets). It seems like the only thing craigslist could have done is to formalize their philosophy-of-value when eBay wished to buy the shares, or ideally prior.
 Which allows the charity to be quantified. Presumably a shareholder would have a fiduciary duty claim if the board decided to give away 100% of profits.
Even unprofitable companies give to charities. Uber, for example, does this.
It's not uncommon or hypocritical for companies to be in favour of regulation that would prohibit things that they currently do. The whole point is that if Salesforce would currently start respecting privacy more than they'd be legally required to, for plain ethical reasons, but other companies don't do the same, then they have a competitive disadvantage. If the law requires them to do so, they can be more ethical while the playing field is level.
Of course there's still a strong and fair open discussion on how far a company should go in the "totally unethical but technically legal" arena of evil shit. But I don't see much of that discussion in this thread.
Companies often welcome regulation. I once read somewhere that when cigarette companies were forbidden to advertise in the EU, their profits went up. All of them were only advertising to compete with the others, it was an arms race without end. When the entire arms race got outlawed, cost shrunk but income did not change. Smokers didn't suddenly switch brand because they didn't see bad jokes about camels every commercial block.
I also disagree with the argument that this is a call for regulation to keep newcomers out. It's true that bigcos rooting for regulation often do this for anticompetitive reasons and it's abysmal, but I really don't see how increased privacy controls such as the GDPR (but in more places) prevent newcomers from outperforming and outmarketing the big shots. You need to come up with a stronger argument about how such regulation affects Salesforce less than a tiny startup. Assuming it's decent regulation, of course; I fully agree if this ends up being a legal minefield.
But e.g. the GDPR is decent regulation that is really not that hard to abide by unless you're genuinely evil (I say this as the owner of a small EU-based startup). The world could use more of that stuff.
You always start ahead in a contract negotiation when you're the one writing the first draft. It's no different here having an adtech/martech company kick off a privacy discussion; Salesforce wants the upper hand because it's easier to know the ways around the legalese when you're the one writing it.
The key is to say "you're right" to Benioff and then draft the law entirely without his influence.
Privacy and free speech exist in natural conflict. Between the two, I prefer our society which enshrines the latter over the former to the European model which does the reverse.
The "right to be forgotten” debate  neatly encapsulates this conflict. Journalists' rights to pry versus citizens' rights to avoid being pried on is another example .
This is a balancing act. Excessive privacy restricts what third parties can talk about. Unrestricted speech means anyone can say everything they know (or know to be false) about others' private lives. There are multiple equilibria. But acknowledging the trade-off is a prerequisite to writing good law.
There's plenty to debate on what speech freedoms people should have and what privacy rights people should have, but it's silly to claim that there's no conflict between them conceptually; if you're stating that there's no conflict, you're assuming a certain set of each, which is just you defining the problem a certain way so that you can say it is solved. If other people don't agree with your definition of the problem, then your solution doesn't hold for them.
I'm drawing a distinction between free speech (specifically parrhesia, which is orthogonal to privacy) and freedom to learn anything about anyone (which is antithetical to privacy). In journalism they are combined, usually to good effect; in doxxing the combination is bad.
On the one hand, slavery was banned by 13th amendment.
On the middle hand, alcohol was banned (18th) and then unbanned (21st) by amendment.
On the other hand, nearly everything else in the Constitution is about the function of government and the rights of individuals with respect to government, not the rights of individuals with respect to other individuals.
seems like a case of letting perfect be the enemy of good
Lawmakers tend to copy & reference laws from other places as case studies. So if one law shows up somewhere, that law tends to start spreading around like it's a meme.
I'm not convinced that Benioff is devious enough to have this ulterior motive.
But clearly, social networks have become the new CRM for many small-mid size businesses... and on that front FB represents a threat.
While I know it cannot reasonably be expected that "government" forget us in this manner, it certainly could be forced to limit access to data that is identifiable back to an individual without their permission.
Example of the details offered, by address or name of owner http://www.cobbassessor.org/cobbga/search/commonsearch.aspx?...
The difference now is that those records are digital and don’t require (in most cases) flipping through actual paper documents.
The question of privacy vs. transparency is an active area of conversation now, particularly because people like yourself are discovering that you never had privacy around some transactions in the first place. Also, machine readability has changed the threat model somewhat.
In general, however, I’d guess that transparency and open access to many kinds of data will continue to be the way the law leans.
I watched Benioff repeat, over and over, how much he loved Facebook, for about 30 minutes. This was in 2010 at the Moscone Center in San Francisco, right before he introduced Chatter.
Passing laws where people somehow own what other people and companies know about you is not a good idea. I saw your dog poop on my lawn and you did not pick it up. Can I tell my neighbors about that? What about posting on Nextdoor? Local newspaper? A tweet? If I libel or slander someone, we have laws against that. It seems to me that telling a truth you know about someone should not be at the discretion of that someone. People just need to know when they are being observed, what is being observed, and by whom (or what), so they can act accordingly. Not sure how to get to that space in this smartphone world.
A restated diary of your life with sub-second and sub-metre accuracy would not violate copyright law. (The initial collection ... might, though if it's not "an original work of authorship", that case is difficult to make.)
What GP and you are looking for is privacy-specific legislation (or court interpretations).
Sounds like he wants to have regulation for all the aspects where their company innovates.