Facebook Secretly Saved Videos Users Deleted (nymag.com)
1187 points by walterbell on March 31, 2018 | 387 comments



This reminds me of the long conversations I used to have with family members and friends several years ago, with their continuous requests that I create my own Facebook profile so I could keep in contact with them and their activities, as well as share my whereabouts. I always used the same argument to reject these suggestions: "I don't want Facebook to have too much data about me, beyond what you have already provided."

I got used to the looks of disbelief, people thinking that I was some sort of hermit, an antisocial person.

I also got tired of answering the frequent "Why don't you have Facebook?" questions.

I remember the last time I had this conversation with someone, last year (2017) around August. I had found a new partner, and after long intimate talks on the phone, they requested the usual "intimate pictures", not necessarily sexual but certainly sexy. While I have no taboos with regards to my sexuality, having an understanding of how the Internet works, I have always refused to send that kind of image/video/audio, and I have always tried to be patient with the other person and explain my constant refusals. Unfortunately, expecting a non-tech-savvy person to understand how data moves around the Internet is, most of the time, wishful thinking, and even if they understand, they ultimately don't care, because the result doesn't change: you don't get to share something with them, and that affects personal interactions.

I am sure that the deletion of media files in services like Facebook has never been meant to be absolute. Many of my colleagues believe the same thing that I believe: Facebook and other services do not actually delete data, they just mark it as "deleted" and purge it only if they need the space. It is the same way a hard drive works: you don't really delete a picture when you hit the "delete" key, nor even when you empty the "trash" folder; the data is still there, where it was, it just loses its links to the metadata.

It is sad how this information becomes news only when bad things happen.


> I am sure that the deletion of media files in services like Facebook has never been meant to be absolute. Many of my colleagues believe the same thing that I believe: Facebook and other services do not actually delete data, they just mark it as "deleted" and purge it only if they need the space.

No need to rely on belief. You can read about the storage architecture used to store photos in a post from 2009 here: https://code.facebook.com/posts/685565858139515/needle-in-a-.... Obviously that might have changed since, and probably has, but at least at some point that was exactly true.

Quote:

"The delete operation is simple – it marks the needle in the haystack store as deleted by setting a “deleted” bit in the flags field of the needle. However, the associated index record is not modified in any way so an application could end up referencing a deleted needle. A read operation for such a needle will see the “deleted” flag and fail the operation with an appropriate error. The space of a deleted needle is not reclaimed in any way. The only way to reclaim space from deleted needles is to compact the haystack (see below)."


Allow me to ask the obvious question.

Who doesn't do something like this?

Not to absolve Facebook of blame, but who's to say data on almost every other social media service isn't also just flagged for deletion?


We don't soft delete payloads at Raygun (https://raygun.com), for the very reason that typically, if one of our customers wants to delete something, it's because they might have sent something they don't want a third party to have. We have filters and other PII filtering tools etc, but every now and then something might be sent by mistake.

Having said that, you'd be amazed how often folks ask for things to be undeleted (despite a big warning dialog).

Clearly developers pervasively believe soft deletes are occurring everywhere.


It isn’t that hard to combine soft deletes with delayed hard deletes: generate a new encryption key every day for “data deleted today”, and encrypt deleted data with it. After X days, destroy the decryption key.

If you use asymmetric encryption, you can keep the group of people who can recover "deleted data" small. You could even have an independent party generate your encryption key pair, give you the encryption key, and give your customer, on request, the decryption key (I think there is a business model for a non-profit here).
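
For illustration, a rough sketch of that crypto-shredding idea, using symmetric per-day keys for brevity (the 30-day window and all names are made up; the asymmetric variant would instead encrypt each day's key to a third party's public key):

import datetime
from cryptography.fernet import Fernet

RETENTION_DAYS = 30  # hypothetical "X days" from the comment above

daily_keys = {}   # date -> key; destroying an entry shreds that day's deletions
tombstones = {}   # item id -> (deletion date, encrypted payload)

def soft_delete(item_id, data):
    today = datetime.date.today()
    key = daily_keys.setdefault(today, Fernet.generate_key())
    tombstones[item_id] = (today, Fernet(key).encrypt(data))
    # The plaintext copy can now be dropped from the live store.

def undelete(item_id):
    day, blob = tombstones[item_id]
    return Fernet(daily_keys[day]).decrypt(blob)  # only works while the key still exists

def purge_old_keys():
    cutoff = datetime.date.today() - datetime.timedelta(days=RETENTION_DAYS)
    for day in [d for d in daily_keys if d < cutoff]:
        del daily_keys[day]  # once the key is gone, those blobs are unreadable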


Instead of having a key that you delete (and building non-trivial infrastructure to support that), why not delete the actual data?


Because the key is smaller, it is easier to make sure you deleted every copy of that key than that you deleted every copy of the data. The data also might be part of a larger backup that you would have to take apart and reassemble in order to delete the data, or might be in a place where doing that is costly (e.g. on Amazon Glacier)


It seems precisely as easy to make sure you've deleted every copy of the data as it is to make sure you've encrypted every copy of the data.


Edit: apologies, seems I read way too quickly! Thanks for pointing it out.


You seem to be commenting out of context:

> generate a new encryption key every day for “data deleted today”,

The question is not whether we can encrypt at the storage layer. We're now talking about encrypting as a soft-deletion method, which means we need to know everywhere the data is stored at deletion time, whether we are going to delete it or encrypt it with this new "deletion" key.


Thanks for raising that issue, I was somewhat confused by the mentioning of encryption as a soft-deletion method... it made precious little sense to me, but everybody seemed to go along with it and I thought I was missing something very fundamentally ’right’ about that idea. Turns out it’s not so.


Absolutely!


That's great that you guys do that. But it can't be proven, so why take your word for it?

Ultimately, it's the trust that is the problem, and that is what needs to be removed, either through new technology or legislation or both.


and why offer the false sense of security?

If they upload a private key and delete it because they "don't want a third party to have" it, do you also guarantee it wasn't seen or cached anywhere else? I don't know the details of that product, but I usually treat anything uploaded even once as compromised from that point on.


This is the same argument people used to make for why it was fine for capabilities to be irrevocable: someone could have copied the data anyway (or whatever), so there was no point in revoking it. In reality, most of the time nobody but the host of a deleted item has access to the data, has a way to tie it to the originator, and has a motive to use it, especially without significant effort. Being able to delete things is a very important feature (not to mention a legal requirement in many countries!), and it's disturbing to me how many people seem to want to justify a world where every bit of data is saved, forever.


How do you handle:

1. Deletions from backups

2. Re-deleting material that was deleted prior to a backup being restored?


> Not to alleviate facebook of blame, but who's to say data on almost every other social media service isn't also just flagged for deletion?

The word "delete" has a pretty clear definition to most users. Facebook is one of the most used pieces of software in the world. If FB is allowed to lie to its users, it would indeed give a pass to just about every social media service out there.

The reason Facebook is special, and deserves special scrutiny, is because of its power. If FB establishes a bad behavior, it will become the norm.


A more prudent question would be whether these tech companies should be reined in by federal privacy law. Should they be allowed to collect, trade and analyze private data on all of their users? Where do we draw the line in the sand, in terms of what's acceptable and what's not?

These are incredibly important questions. A related field would be the credit bureaus, such as Equifax. Global companies who store social security numbers and all other sorts of information. We need a national set of rules for these companies to follow.

Not keeping my hopes up, given our Congress is so dysfunctional these days.


I would think undelete also has a clear definition. Technical implementation is orthogonal to these kinds of definitions.


Does it make it ok for Facebook to do it just because other similar companies do it? I say no; all of them should delete something when I say to delete it. And "everyone does it" makes it a bigger problem, not a smaller one.


A lot of the big agile companies are using event sourcing. So there isn't even a model to delete. It's all events with the models being created from a snapshot of events. The event stream is usually durable and lives forever.

https://martinfowler.com/eaaDev/EventSourcing.html

So with this type of system nothing is ever "deleted". It's just an event that something is deleted.

This is a common and very scalable system. You don't deal with models, you deal with events (and a model is a snapshot of events).
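
For illustration, a toy version of that pattern (all names made up): "delete" is just another appended event, and the item only disappears from the derived snapshot, never from the log itself.

events = []  # append-only event log; nothing is ever removed from it

def record(event_type, item_id, **payload):
    events.append({"type": event_type, "item_id": item_id, **payload})

def snapshot():
    """Fold the event stream into the current state (the 'model')."""
    state = {}
    for e in events:
        if e["type"] == "uploaded":
            state[e["item_id"]] = {"data": e["data"]}
        elif e["type"] == "deleted":
            state.pop(e["item_id"], None)  # gone from the model, not from the log
    return state

record("uploaded", "video42", data="cat.mp4")
record("deleted", "video42")
assert "video42" not in snapshot()  # the user sees it as deleted...
assert any(e["type"] == "uploaded" and e["item_id"] == "video42"
           for e in events)         # ...but the upload event persists forever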

Backups are an issue even for other companies that aren't event sourcing but have a traditional model architecture. You ask for something to be deleted and they might actually delete it, but what about last week's backup? It's not deleted there.


User expectations about a deletion would probably be "make it as though this was never uploaded": no copy, no backup, no recoverable form whatsoever. And yet, if it was not deleted, and there was a problem requiring backups to be used, they would expect to never even know about it, just that all their data would remain.

It's very much against the rules in event sourced systems to change history. But maybe that just doesn't matter. If it means you can never meet a user expectation about privacy, I guess you could tell the user that everything persists indefinitely... or when something is deleted, go back to the upload event and remove it, rebuilding history with any event related to that uploaded item ignored. Putting the user above the "purity" of the software and creating potential problems elsewhere.

Even on backups in long term storage, there could be some process of creating new copies of the backups with any needed modifications on some kind of schedule, so deletions can propagate over time.

Ultimately the challenges here are financial. We could delete things thoroughly if we were willing to pay for the developer time and other resources needed to make it work.


I was thinking the other day that GDPR might make event sourcing problematic.



"You are strongly encouraged to backup a database before excising data."


You are supposed to delete old backups.


Why? Disk space is cheap.


Why would they? They implement their system the way they want to. Also, this is a completely logical way to deal with deletions. This is what I would do (and what I have done, when I created a simple CMS). I don't want an endless quarrel with a customer who "accidentally" deleted something and wants it back. I just flip the switch and it is back.


Like I said, no respect for user choices.


Why even have a delete button? Why lie to your customers about their privacy?


What does this have to do with privacy?

Do you mean the delete button is a lie? Why would it be a lie? Can you or someone else access the deleted video from Facebook.com? Or in another public way? Isn't it deleted from this point of view?

I am not defending Facebook in any way. I just don't understand why everybody is surprised about these things. Do you think if you click delete on a video on YouTube, it physically deletes the video from all of its servers?


I recall the distinction being made very clear on LiveJournal between "deleted" content vs "purged". I would be very surprised if they were not being forthright about this. Of course this was 10 years ago, before the Russian ownership. So I do have reason to believe that not all companies act in deceitful ways when it comes to retention of user data.


Snapchat claims to. And after the trouble they once got into for not deleting media after it was viewed, I believe them.

https://www.snap.com/en-GB/privacy/privacy-policy/


Why would you believe them after they have lied to you once already?


Right, I wonder how much people trust American companies. Imagine a Chinese firm doing the same and how many will trust them again..


It is best to never have any Chinese company store your data. They are by law (and under severe penalties) required to make all data in their possession available to government officials at any time that it is requested. Dictatorships are like that.


Chinese firms are generally required under threat of arrest to store and/or transmit all data on users. A single Chinese firm failing to delete data would have little or no negative impact within China given that they’re probably already secretly required to do exactly this. Do you mean US customer impact?



Nothing in those articles suggest that Snapchat purged user data.

> “resolved most of those concerns over the past year by improving the wording of our privacy policy, app description, and in-app just-in-time notifications.”

Snapchat just changed their messaging to quell user concerns. Once they have a critical mass of users, they are immune to the fallout of disclosing that Snapchat messages are not truly ephemeral.


I understand your skepticism considering the behavior we've seen from some of these companies recently — but when their settlement with the FTC includes an independent company monitoring their handling of user privacy for 20 years, I think it's safe to trust them on this one.

>Snapchat servers are designed to automatically delete all Snaps after they’ve been viewed by all recipients

>Snapchat servers are designed to automatically delete all unopened Snaps after 30 days

https://support.snapchat.com/en-GB/a/when-are-snaps-chats-de...


Yeah, I don't see much difference between this and hitting delete on a file in a local file system. The data itself still sits there until the sectors get reclaimed, but there is no longer a file name or directory entry associated with them.


The database we use (Vertica) works this way. Nothing is deleted. Instead it is flagged as deleted. A background task may purge old data (older than x). Historical queries show the database state as it was days or weeks ago. If the background task is broken (bug?) then the data stays on disk indefinitely.


So, like a filesystem more or less

I might just "help" them by uploading more data I guess


File systems eventually overwrite that data, though. FB's system specifically never reclaims it. Why on earth would you ever do that, unless you have absolutely no respect for your users' wishes?


Not standing up for FB's other practices, but from a technical standpoint there are several reasons, none of which are about not having respect:

- disk space is cheap
- deletes are expensive (time) and slow
- deletes are harder to scale
- can't revert a real delete
- deletes don't fit into an event sourcing architecture
- append-only data is better, more durable

I could go on.


Placing technical convenience above user wishes is absolutely a lack of respect for those wishes. All of your reasons essentially come down to "it's not worth the effort".


Not at all. All my reasons were technical in that events are part of a stream, and delete is just one more event. When you reconstruct the stream, the end product is the item is deleted. But you could recreate the item from the stream so technically not deleted.

Consider companies that take daily backups. Say a user asked to delete something: do they now go and comb through their backups (which might even be offsite or in cold storage) and delete it? It's essentially the same thing.


Choosing to adopt a technology that makes deletions impossible absolutely shows a lack of respect for user decisions. Not building in the ability to deep delete from backups is the same. There is no technical restriction on deleting data, just company decisions that make it difficult.


Isn't every business that has backups in the same boat? Event sourcing is just like having continuous backup.

So you're saying that every company that has a backup system, and that doesn't regularly go through the backups and remove individual files because users requested it, is showing a lack of respect? So companies that have offsite backups should, according to you, have policies in place where user data is also removed from offsite backups?


I assume that there might be technical reasons to do it that way.

For example: a soft delete may be just a stronger version of public vs private settings. The whole software infrastructure still assumes a link exists and doesn’t need to cover cases where it really isn’t there. I could see how that makes maintaining indexes etc easier.

Flipping a flag and then filtering out results down the line based on the delete setting is probably much easier than actively removing them from an index.

And if deleting is rare (it probably is), then the performance and resource impact should be minimal.


> unless you have absolutely no respect for your users wishes?

Hehe, you mean, like... Facebook? They respect advertisers with money, not users.


What video would you upload? Just nonsense? Is this to become the modern equivalent of a "black fax"?

https://en.wikipedia.org/wiki/Black_fax


A browser add-in to help with this might help make the world a better place.


Banning accounts using this add-on (breach of ToS) would be a formality for facebook, if this was to become an issue in the first place. (unlikely that a sufficient number of people will bother doing this)


So it's ok if people die for their growth, but not if users run a browser extension?

At this time, what's the loss in being banned?


Unfortunately there will be no prizes for having been right all along.

Even now, as facebook is burning, statements of how one has quit or will be quitting facebook get swept into the pile of incendiary indignation, with encouragement from all sides.

But never having used facebook, even at significant personal effort as you indicate, one is relegated from "elitist" before to "smug" now.

One day in the future a recruiter will ask why there's nothing about you on the Internet, and you will proudly be able to say: "Because I know the Internet and its dynamics that well" and they will hire you, in awe of your analytical foresight.

That's the dream anyway, because you're more likely to be reported for being suspicious. After facebook there will be another facebook, and another, and people will flock to them just the same, and you get to experience being an antisocial hermit all over again.

Now I made myself sad. "Social Media: even more depressing when you're not on them!"


> Unfortunately there will be no prizes for having been right all along.

Except not having your racy pictures in Facebook's media archive.


What about the option where people return to messaging applications for private matters and keep a Tweetbookdin for public persona ?


Facebook is burning? Lol. They suffer a minor setback and they're "burning". Right now there is not much of an alternative to Facebook; it will be just fine.


"I am sure that the deletion of media files in services like Facebook has never meant to be absolute." This is very common, I'm sure. There should be a way to request or a right to request permanent deletion, by law, of one's data on site like Facebook. That said, once something is on the internet, anyone can and will archive it (see https://www.reddit.com/r/DataHoarder/). Closing an account, however, should imply permanent deletion. Companies are instead able to operate in a gray area through terms of service agreements that knowingly play on the ignorance of the end user. This common and widespread behavior is a detriment to the user and (arguably) society at-large.


Obviously I'm not privy to the details of this particular requirement, but I'm fairly certain that very few, if any, of our videos actually go away when we delete accounts. (Or even when we delete the videos themselves.) I think this because I've seen images from SMS texts, instagrams, snapchats and things of that nature used in court cases. So law enforcement must have access to that stuff somehow? But, again, I'm not privy to the technical or legal mechanisms they use to make that happen. All that said, I have seen images from services like these in court cases. And defendants have CLAIMED that they had deleted them. (For whatever value of "deleted" exists on the given service.)

So I'm wondering if the services actually have some sort of archiving requirement for law enforcement purposes? Maybe for a certain number of years, they have to save your data or something like that?

If there's anyone who would be familiar with the legal obligations of these services vis-a-vis data archiving I'd be really interested in hearing more about what we should reasonably expect from these services in terms of deletion etc?


> So I'm wondering if the services actually have some sort of archiving requirement for law enforcement purposes? Maybe for a certain number of years, they have to save your data or something like that?

Apart from a handful of specific cases like financial data, the US has no general data-retention laws. You can delete stuff aggressively as long as it's based on a consistent archival policy, not one-off deletions where you risk looking like you chose a particular thing to delete to hide evidence.

You can tell this is possible in practice by looking at how common it is to have aggressive permanent-deletion policies in corporate email, at least outside of tech. A number of big US companies automatically delete read emails in employees' inboxes after N days (with N ranging from 7 (!) to 365), unless the employee specifically takes action to refile the email into a project folder with a different per-project retention policy. The goal of those policies is to reduce companies' exposure to fishing expeditions in future lawsuits by just keeping less email around. To make that effective, the policies really do delete the emails, including from any backup systems.

Given that they have figured out how to perma-delete their own old email, I believe companies could really delete user-deleted content, perhaps after some specified period of time, if they wanted to. But unlike with their own internal emails, they don't have the same incentives to be aggressive about purging that stuff from their servers. If anything, they have the opposite incentive, to keep as much user data around indefinitely as possible.


GDPR is intended to at least force service providers to give folks the right to be forgotten, which compels providers to delete data. While it only covers Europe, it's difficult to comply without just making a general decision about honoring these requests.


Actually, GDPR only requires that any links from the data to the user should be destroyed, so that you can no longer figure out who created the data. This means that a lot of data will be left. And realistically I think that a lot of it will remain identifiable, just like anonymized data can be traced back to real users pretty easily if you have enough data points.


I’m not a GDPR lawyer, but I do live with one.

My understanding is that an image is by itself PII, regardless of whether or not it has any additional information associated with it. I don’t think there’s a way to retain images without contravening GDPR.


Data doesn't have to be PII to fall under the provisions of the GDPR. Personal data doesn't have to identify a person; relating to an identified or identifiable living individual is sufficient (https://ec.europa.eu/info/law/law-topic/data-protection/refo...)


Unless I'm misreading, that criteria rules out data about individuals that are not identified and can't be identified.

When looking at a single datum by itself, this seems to rule out anything except PII i.e. data that identifies or can be used to identify an individual.


I’m not sure I understand what you’re saying, but I think you’re misreading ”Different pieces of information, which collected together can lead to the identification of a particular person, also constitute personal data.”

What that says is that, if (A,B,C) identifies a person, each of A, B, and C, in isolation, is personal data, not that you will be allowed to keep the pair (A,B) if it doesn’t.

One mathematically can cut each bit of information in units of arbitrarily small entropy. So, if taken to the letter, “this user is not Mark Zuckerberg” would be personal data. I doubt jurisprudence will go that far, but we’ll see.


Facial recognition means all pictures with a face are personal data?


No clue. All I did was rule out anything that can't be used to identify someone.

Whether information that can only be used to identify someone but doesn't tell you anything useful about them is still personal data is unclear to me.


If in doubt, a picture tells you medical information.


Actually, GDPR only requires that any links from the data to the user should be destroyed, so that you can no longer figure out who created the data.

Not in this case, because if the photos or videos contain recognisable people then they are themselves personal data.

How far the new subjects rights involving data deletion will go in practice is one of the biggest unknowns with the GDPR. Clearly from a technical point of view we understand that deleting a key isn't the same as deleting data from a disk, and often that would also include deleting a file in a filesystem if the underlying storage isn't robustly wiped as well. Throw in the kinds of distributed architecture, redundancies and backup systems that many organisations use, particularly in the era of cloud-based hosting and off-site backup services, and you have an unfortunate conflict between not truly deleting data (and therefore still having some degree of risk that the data will leak even if it's intended to be beyond use, contrary to the spirit and possibly the letter of the new regulations) and potentially high or even prohibitive implementation costs to ensure robust deletion of all copies of personal data when a suitable request is received.


Pretty certain that videos and photos count as personally identifiable information that have to be deleted.


The right to request deletion of data is already mandated in current laws, GDPR doesn't change much in that regard.

The hard part is actually enforcing it, and assessing compliance.


> Facebook and other services do not actually delete data, they just mark it as "deleted" and purge it only if they need the space.

You may be correct, but that doesn't explain why Facebook decided to include so-called deleted files in a download of user data. Clearly these deleted files are still a part of Facebook user profiles and accessible to company data mining software. Facebook has exposed their own duplicity.


Maybe the Facebook development processes and tracking of tech debt are just shit. First person: "I'll just flag the content and then it won't show on their timeline!" Second person: "I'll just select all the records that belong to this account when packaging a backup. All the deleted content should be gone!"

But I wouldn't discount your hypothesis.


When storage is cheap, it's rational to develop the delete flag first and think about cleanup later, which means never. The download-your-content thing seems like a low-priority project, and the poor intern who probably did it didn't want to figure out how each store keeps the delete flag. At least it's honest. Would you be surprised if a dd of your SD card showed your deleted photos?


Storage being cheap is irrelevant when a user requests the data to be deleted. You delete it. Outside of government compliance there is no reason not to comply with that request.


<cynical view> When a _customer_ requests data to be deleted, you delete it. Pretty sure Facebook has complied with every user-data-deletion request they've ever got from their paying customers - because advertisers are well known for wanting access to less data about the cattle...


I doubt that advertisers requesting to have all their information deleted (apart from tax relevant data) would be more successful.


Fuck their paying customers


That's not true. What if you want to support un-delete? That's a reason.


To my knowledge there is and has never been any option to “undelete” your deleted content on Facebook.


I'd be surprised if there wasn't at least a way to deal with large-scale account hijacking, where the hijacker deleted lots of hijacked accounts' content.


Mm, what about deactivated accounts? You can reactivate them, which would involve undeleting data.


Yes there are, but there are no moral reasons to not comply with it.


If your ethics are such that you believe the state should be able to view data on someone in order to help prosecution of a crime then you could support the retention of data on all users in order to avoid deletions made to hide criminal activity.

Such an ethic creates a moral reasoning to not comply with an individual's wishes in the immediate deletion of data.

(FWIW I'm not defending this position nor suggesting it's the case here, just you said there's no moral reason that can support it, which seems wrong; different ethical systems can provide different reasoned moral outcomes.)


Why, to help people connect better of course...


There might be something to that, in the sense that I could very well see someone in a meeting ask 'well, but what do we do when another user is tagged in a photo' followed by discussion rationalising why that person shouldn't lose out because someone wants to delete data, and someone coming up with the 'solution' of effectively reference counting the data and figuring out when to actually purge the underlying file later (i.e. never)


Maybe they should get someone from the internal department that monitors Facebook employees to come over and show them how to run a tight ship:

https://www.theguardian.com/technology/2018/mar/16/silicon-v...


>Facebook has exposed their own duplicity.

Or possibly they just screwed up. Perhaps the "soft delete" was originally intended to allow "undelete" by the user with delayed purge, and/or single-instance storage with reference counting that they never quite got around to finishing.


> but that doesn't explain why Facebook decided to include so-called deleted files in a download of user data.

This happened because the person tasked with writing the code to build the archive forgot to include the filter for "deleted" records somewhere in the code.

I.e., they forgot the "where is_deleted = false" part below on one or more DB query requests like this:

select * from table where is_deleted = false;

This is the biggest problem with the "soft delete flag in database" method of deletion. Every single query writer, everywhere, forever, must always remember to include the "is_deleted" filter in their queries. And when they don't, what was deleted reappears as if it had never been deleted at all.


If you have soft-deleted user data, then you have user data, so you had better include it.


That is a good point, but flagging shouldn’t be the end of the line for soft deleted data. There should be a process going back and removing everything that was flagged for deletion, prioritized to guarantee deletion within a set time frame but without impacting performance. Meanwhile, most queries should be done through a view that automatically masks out any flagged data. It’s a basic data integrity feature that shouldn’t be left to their API (which is such a fast moving target that one developer doesn’t know what the other is doing much of the time).
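
For illustration, a minimal sqlite sketch of those two pieces, with a made-up schema: a view that always masks flagged rows, so individual query writers cannot forget the filter, plus a purge job that turns the soft delete into a hard delete after a grace period.

import sqlite3, time

GRACE_SECONDS = 30 * 24 * 3600  # assumed "set time frame" for guaranteed deletion

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE media (id INTEGER PRIMARY KEY, owner TEXT, blob_ref TEXT,
                        deleted_at INTEGER);              -- NULL means not deleted
    CREATE VIEW live_media AS
        SELECT id, owner, blob_ref FROM media WHERE deleted_at IS NULL;
""")

def soft_delete(item_id):
    db.execute("UPDATE media SET deleted_at = ? WHERE id = ?",
               (int(time.time()), item_id))

def purge_expired():
    # The part that has to actually run for "delete" to eventually mean delete.
    cutoff = int(time.time()) - GRACE_SECONDS
    db.execute("DELETE FROM media WHERE deleted_at IS NOT NULL AND deleted_at < ?",
               (cutoff,))

# Application code reads only through the view, so it cannot resurrect flagged rows:
rows = db.execute("SELECT * FROM live_media WHERE owner = ?", ("alice",)).fetchall()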


Facebook is powerful and insular. Taking it down requires extraordinary organisation. Outrage is helpful in that respect.

Agreement is better than disagreement. Would I prefer we had agreement earlier? Yes. Is agreement today better than agreement tomorrow? Absolutely.

Now that we have a constituency, the important thing is to mobilise. The past is in the past. Our job, in the present, is to protect the future.


Call for a facebook user strike on May 1st.

#May1FBstrike

https://medium.com/@oddbert2000/call-for-a-facebook-users-st...


Excellent - they'll be able to offer advertisers another demographic category: "People who are vocally privacy conscious, but who aren't prepared to do anything about preserving it if it means they don't get to play Farmville."

Antivirus vendors and shitty vpn services will be all over that.


Why not just every day? Only use it as much as absolutely necessary (to communicate with people you wouldn't be able to reach otherwise) and use competitors instead. Even using FB owned companies (e.g. Whatsapp) would help. While FB still gets some data, they don't get contents (unlike FB messenger all chats are end-to-end encrypted) and most importantly no ad revenues. And lower revenues is what would truly change Facebook's policies.


what is the point?


You think people could organize better against this than against Trump? Seems unlikely.


Trump has many supporters(not me). I don't think any facebook users particularly like facebook, it's just where "everyone" is.


Exactly this.

Who even supports FB?

The media? No, they hate them because they took all their ad revenue.

Republicans? No, they hate them for the censorship controversy that happened a year or so ago, and because it is full of ultra left wing silicon Valley types.

Democrats? No, they hate them for the whole Cambridge analytica, Russian data thing.

Facebook has made a lot of enemies, and there really isn't any sort of constituency that SUPPORTS them.


A bunch of companies get a lot of their customers through Facebook. Those will probably support (financially and otherwise) FB as long as there are no obviously better ways to get customers. Not for any sentimental reason, just because their business relies on it.


Not necessarily. Businesses are usually fine with making things worse for themselves if it also impacts all of their competitors, preferably even more. That is, as long as a business comes out a bit more ahead of their competitors, a business is usually just fine with making something worse for everybody.

So, even if a business currently gets a lot of revenue from Facebook, as long as a business thinks that other businesses in their field are more dependent on Facebook than they themselves are, they should be fine with Facebook declining.


They have two major things.

1. Money

2. The fact that they are the only source of information that a good percentage of the world uses.

Money can be used to buy power, and they already have a not-insignificant level of control over the flow of information (which is power itself).


Facebook's ability to prop itself up and exist as it does, despite all the animus against it, reminds me of the essay Meditations on Moloch.


Still, absent an adequate replacement people will continue to stay there.


> better than against Trump?

It is more effective to organise for a cause than against a politician. Presidents are intentionally difficult to remove. The bar for promoting action against Facebook is lower than for prompting action against the President.


> It is sad how this information becomes news only when bad things happen.

What bad things? I feel that's the part missing from the argument. People have yet to see or hear what the negative consequences are of all that data being kept, or even leaked or resold.

The only one they've started to know about is the potential impact on elections, which is pretty hypothetical and weak to most people I feel. Or maybe identity theft, but that's more related to the Equifax leak.

I think it's important to reason about what the real consequences of our data no longer being private are. Is it really dangerous? What's the worst that could happen? What are the chances of it happening, etc.?


> I have always refused to send that type of images/videos/audios

Isn't it still trivial to self-host stuff?

Just send a link to the picture (or document or whatever confidential information you want to share) as a password-protected resource on your own server (or even a laptop or desktop machine, if you have a globally routable IP address there). Facebook automation is not smart enough to grab the password from the very same conversation, and even if it were - I'm sure they won't do it, knowing you'll catch them in access logs and press charges for unauthorized access.

I doubt many would object and insist on sending via a very specific medium (i.e. strictly require pics in a FB Messenger). Some, of course, may find this inconvenient.


"trivial" and "your own server" together? :) Maybe for some code monkey, but not for my mom. :(

I really do wish self-hosting were more trivial, it would be a better world.


Where I live Internet providers deliberately make self hosting anything extremely hard.

Then they often charge 5x or more their normal price to let you host things, but add lots of exceptions; for example, all providers put in the contract that they can immediately cancel your subscription if they detect you hosting anything IRC-related, no matter whether it is an IRC server, an IRC bot or a server for an open-source IRC client...


I think a great way to explain privacy limitations to a non-tech-savvy person is to walk them through using GPG.

Once someone understands public and private keys, and webs of trust, there really isn't much left to learn. For someone who understands keypairs, the limitations of Facebook/Twitter/etc., DRM, etc. are obvious.

It seems most of us are afraid our non-tech-savvy friends and family won't be able to wrap their heads around security, but not understanding it has gotten us into a pretty bad situation. We should really stress the importance of learning about it.


> Once someone understands public and private keys, and webs of trust

Nobody in the general public wants this.


Okay, don't assume people won't be interested in interesting things. Who is this general public, anyway? It's not an homogeneous group; it's made up of physicians, mechanics, teachers, lovers, Doomsday preppers, engineers, preachers, and all kinds of people who have special interests. The thing I see is that if you show them how it matters to them in their special role, rather than to them as members of this general public, they may well take an interest. Some of them may become very deeply interested indeed, if they needed such a thing but didn't know about it until you showed them!


It's interesting to you, not to most other people. Source: 25 years of talking to people about encryption. Most people just want stuff to work, not to know how it works.


Honestly, it's not been that interesting to me in general. It's only interesting to me for the same reason it might be interesting to the sorts of people I enumerated --- because of the ways it can be useful to me. I don't really care about how it works, in depth; I just want it to keep my stuff private. The only difference is that I have just enough technical expertise, as a programmer, that I can see its applicability without having it explained in a sympathetic manner.


An awful lot of people in the general public do.

Especially if their tech-savvy friends are confident they can learn about it - because it really isn't that complex - and if they understand that keypairs and trust are the basis for literally all digital security.


That doesn't mean they shouldn't.


I'm fairly tech savvy (ok, I'm an expert compared to my non-tech family and friends, but not compared to people here). I even had a copy of pgp on my Windows 3.1 machine shortly after Phil Zimmerman created it. I didn't understand it then, and I don't want to understand it now. The better and easier solution has been to avoid putting stuff I don't want anyone to know about me on the internet.


I'm a linux sysadmin, and GPG is horrible. Complicated, complex, with weird naming scheme, multiple programs (gpg vs gpg2), etc - but it's a brilliant example why all of this is so complicated. Other ideas about describing the trouble trust means on the internet are welcome.


> great way to explain privacy limitations to a non-tech-savvy person is to walk them through using GPG.

Have you ever actually successfully done this? More than once? And they continue to use it?

It's a usability nightmare. https://moxie.org/blog/gpg-and-me/


OK. Maybe not GPG specifically.

But keypairs. Everyone should understand keypairs. They are the basis for all of digital security, and they are really not that difficult.


Symmetric encryption is also popular and ubiquitous. I don't think this can be practically explained to most of the population of non-technical people. Encryption schemes also often use hybrids of symmetric and asymmetric encryption; they are useful for different scenarios.
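
For what it's worth, a tiny sketch of that hybrid pattern (illustrative only, not a vetted protocol): the bulk data goes under a fast symmetric key, and only that small key is encrypted with the recipient's public key.

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_public = recipient_private.public_key()

def hybrid_encrypt(message):
    session_key = Fernet.generate_key()               # symmetric: fast, any size of data
    ciphertext = Fernet(session_key).encrypt(message)
    wrapped_key = recipient_public.encrypt(session_key, OAEP)  # asymmetric: tiny payload
    return wrapped_key, ciphertext

def hybrid_decrypt(wrapped_key, ciphertext):
    session_key = recipient_private.decrypt(wrapped_key, OAEP)
    return Fernet(session_key).decrypt(ciphertext)

assert hybrid_decrypt(*hybrid_encrypt(b"hello")) == b"hello"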


Lol. Sure, and the best way to teach my grandma about computers is to install arch linux on her pc.


>I got used to the looks of disbelief, thinking that I was some sort of hermit, an antisocial.

I know that look.

>I also got tired of answering the frequent "Why don't you have Facebook?" questions.

I solved it by stating flatly "For the same reasons I don't have Twitter.", somehow marking the final period. People still believe I'm a kind of weirdo, but they don't go on asking...


I went through the same issues with friends and loved ones wanting me to create a Facebook account. I resisted for years with the same arguments you made. It had some unfortunate consequences: https://news.ycombinator.com/item?id=16675681


>Unfortunately, expecting a non-tech-savvy person to understand how data moves around the Internet

Explain how data can be unreadable while it moves. Teach them to use secure communication options. You don't need to be an electric engineer to use a TV remote control.


But no one has made a TV Remote control version of "encrypted Facebook" or even "encrypted eMail".

And heck, there are people who can't use tv remote controls.

The only thing that I'd consider "easy" is encrypted chat (signal). The "issue" there is market fragmentation (arguably a good thing).


What secure communications options specifically?


I always tell people to treat Facebook as if every person you ever meet will be able to see it. It's more or less my public persona. Twitter is more anonymous.


Unless it's encrypted and ephemeral, treat every bit sent out to the internet, a public network, as your public persona.


Yep, another way of looking at it is if it leaves your device, assume the information is eventually open to the public.


Amen. Once you upload it, you should just assume it’s out there forever. It’s probably worth assuming that virtually all anonymity can be pierced, if not now then within a decade or two.

With a few exceptions, anonymity online is ephemeral at best, subject to the motivation of the person/org trying to deanonymize you.


> Twitter is more anonymous.

How did you arrive at that conclusion? I assume Twitter retains everything as well (even "deleted" tweets) and it's all associated with an email address. Or did you mean it in the sense that far fewer people have a Twitter account?


I used to believe Twitter was better. But once you're above a certain number of active (i.e. publicly retweeting) followers there's a pretty high chance that your tweets will end up in the feed that is used to generate the twitter stream archives:

https://archive.org/details/twitterstream?sort=-publicdate

These are tar files that contain bz2 compressed newline separated twitter events as json. These include deletion events as well, so you can for instance easily estimate the time an auto-deleter is set to.
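
As a rough sketch of that kind of processing, something like the following could pair tweet and deletion events from one of those tarballs to estimate an auto-deleter's delay. The field names (created_at, delete.status.id, timestamp_ms) follow the old public streaming API, and the exact layout inside the archives may differ, so treat this as an assumption-laden outline.

import bz2, json, tarfile
from datetime import datetime

TWITTER_TIME = "%a %b %d %H:%M:%S %z %Y"  # e.g. "Sat Mar 31 12:00:00 +0000 2018"

def deletion_delays(tar_path):
    """Yield (tweet_id, seconds_until_deletion) for tweets whose creation and
    deletion both appear in this archive."""
    created = {}  # tweet id -> creation timestamp, for tweets seen so far
    with tarfile.open(tar_path) as tar:
        for member in tar:
            if not member.name.endswith(".json.bz2"):
                continue
            for line in bz2.decompress(tar.extractfile(member).read()).splitlines():
                if not line.strip():
                    continue
                event = json.loads(line)
                if "delete" in event:
                    tid = event["delete"]["status"]["id"]
                    if tid in created:
                        deleted_at = int(event["delete"]["timestamp_ms"]) / 1000
                        yield tid, deleted_at - created[tid]
                elif "created_at" in event and "id" in event:
                    created[event["id"]] = datetime.strptime(
                        event["created_at"], TWITTER_TIME).timestamp()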

Yes, they're huge archives, but you could still probably process a year of these for particular targets for under $10 on EC2.

Whilst I'm impressed with archive team's efforts, I would be surprised if there aren't some commercial twitter stream consumers that absolutely dwarf this.

Treat everything you put on twitter as public forever and you won't go too far wrong.


Why would you ever believe it was private? I think Twitter is better because everything is explicitly public, rather than dishonestly pretending it's private.


Well, I never said I did believe it was private. When I said better, I should really have said better behaved in respect to deletions.

Because of the twitter stream APIs it's not. But there does seem to be a strange presumption amongst users that deleted tweets are gone from public view and cannot resurface. There are people who use tweets in all manner of ways that they really weren't designed for, some of which involve deleting them after a few minutes.

Many a public figure uses these tweet deletion apps. Some do it for more honest reasons (status count limits -- do they still exist?), others do it to limit their exposure.

In the UK at least, there have been cases of libel where either the claimant or the defendant depended upon twitter and in at least one of these the court admitted the claimant had an unfair advantage by forgetting about having a tweet deletion app attached to their account. The case proceeded and the claimant won despite the acknowledged advantage. To some, this may be seen as a clear message that in the eyes of the judiciary it's okay to delete tweets (evidence) as long as it was through an auto deletion app and the individual concerned forgot about its existence.

I would not at all be surprised if some lawyers to the rich quietly suggest they install a tweet deletion app as general advice upon instruction.


Twitter is more about finding your own social graph of people you find interesting than friends/family/coworkers like Facebook is primarily about. I could have a completely anonymous persona on Twitter and get all the same content. I could use a fake name on Facebook but it wouldn't make as much sense, and I could be reverse engineered with some accuracy from just my social graph. Your family is going to tag you as family, etc. The other non family and friends content on Facebook is more watered down than on Twitter and Facebook wouldn't be worth using for that alone.


How is Twitter more anonymous? In the UK people have been locked up for tweets.

Twitter probably has less data on you, but I doubt it can't be linked directly to you by a TLA, say.


>I remember the last time I had this conversation with someone, last year (2017) around August. I found a new love partner, and after the long intimate talks on the phone, they requested the usual "intimate pictures", not necessarily sexual but certainly sexy.

Why the fuck are these a thing? Couples don't meet in real life much anymore? And how "usual" are they?


Anyone have stats on how widespread this is? My spouse and I avoid being in front of cameras naked even when we're pretty sure the camera isn't enabled. Not that anyone else would really want to see us nude, but why take a chance on accidentally recording material that could be embarrassing?


> expecting a non-tech-savvy person to understand how data moves around the Internet

Then we - the people that do have the necessary technical knowledge - have a duty to teach them what they need to know. This isn't necessarily "how data moves on the internet". Yes, this can be difficult and tedious, but understanding the risk profile for data/networks is increasingly important as networks become involved in everything.

> they ultimately don't care

Again, it's our duty to teach them why they need to care. This probably shouldn't involve a lecture on networking or data analysis, but instead tailoring an explanation to their personal situation and knowledge.


I don't think it's because they don't understand or because they don't care, it's just overwhelming. Think about it, to have any basic grasp of understanding regarding the security infrastructure of the internet you need to have a basic understanding of network connections, how HTTPS works, how files are stored on your computer, how files are sent across computers, how your average database works etc...

Think about the last time you've tried tinkering with something you're a noob at. Maybe it's deciding that you would try fixing your car engine yourself even though you never were a mechanic. Maybe you decided to make a complicated cake and halfway through you realize that you overestimated your pastry skills. Try to remember the feeling of helplessness you felt at that moment, the "I have no idea what I'm doing and I wish I never had started that in the first place". In my experience that's how 90% of people feel like when trying to do something technical with a computer.

A few weeks ago a colleague from HR asked me if I could make a backup of a computer because it contained some critical stuff and she wanted to be able to restore it later if necessary. I say okay, boot up a debian live USB stick I had lying around and start dd'ing the drive to external storage. When I told her the copy was in progress she told me "but I didn't give you the password?". She was amazed when I told her that I didn't need the windows session password to access the data on the disc. I swear I'm not making it up when I say that she asked me if I was a "hacker".

That made me realize that there are probably many people out there who think their files are safe as long as their Windows password isn't compromised even if the disc is not encrypted. After all, they can't access the files, so surely nobody else can? If Facebook says my photo is deleted, then surely it must be? Why wouldn't it be?

I don't think it's fair to blame these people, we've designed so many strange patterns over the past decades in software that it's difficult to keep track. Maybe having "delete" not actually delete should be considered a dark pattern. Maybe it should even be illegal.


"That made me realize that there are probably many people out there who think their files are safe as long as their Windows password isn't compromised even if the disc is not encrypted."

Of course they assume it. Partly also because Windows tells you that if you lose your password you can no longer access your account, which is BS, and they know it and tell you only for the sake of "perceived security".

And encryption ... What is that?


Are you sure Windows is not showing you that message because you enabled encryption in your account? The encryption key is itself encrypted with your password, so the warning makes sense.

https://en.wikipedia.org/wiki/Encrypting_File_System


Yes, very sure, because I recently had to "crack" several non encrypted Windows 10 PC's. And that message amused/angered me very much when it was 5 min work to prove that message wrong.


And how would we do that? Every time I've tried to explain privacy issues to non-tech individuals, at best they consider me paranoid and at worst a fucking sociopath who doesn't have a FB profile because I can't relate to other people. I can't carry this burden and I doubt many can.

There have been horror stories over the years about identity theft, even before the emergence of social media. Has this stopped anyone outside our community from posting details about their lives online? I hardly think this whole situation with FB will change anything in the end.

I don't feel I have any obligation/duty towards anyone. If they want my opinion or ask me about an issue I'll gladly inform them. But I won't start a crusade for a better informed society. Internet was supposed to do that and we ended up with videos of cats and wannabe celebrities posing seminude pics on Instagram. Fuck that shit.


Your view is well represented on the Internet, and is perhaps most aptly exemplified by the early jargon word “luser”, and the BOFH phenomenon. I have never, I think, really been prone to such thinking. I have never had a problem talking to ordinary people or users, or felt the immense frustration which many people have vividly described. (Note: I am a sysadmin with approximately 20 years of professional experience, and have always had a user-facing role as at least a part of my job.)

It reminds me where in Zen Buddhism, there are those who become enlightened and go off to do their own thing, and those who become enlightened and stay in the world with the rest of the ordinary unenlightened people. In the words of Alan Watts:

The understanding of Zen, the understanding of awakening, the understanding of– Well, we’ll call it mystical experiences, one of the most dangerous things in the world. And for a person who cannot contain it, it’s like putting a million volts through your electric shaver. You blow your mind and it stays blown. Now, if you go off in that way, that is what would be called in Buddhism a pratyeka- buddha—“private buddha”. He is one who goes off into the transcendental world and is never seen again. And he’s made a mistake from the standpoint of Buddhism, because from the standpoint of Buddhism, there is no fundamental difference between the transcendental world and this everyday world. The bodhisattva, you see, who doesn’t go off into a nirvana and stay there forever and ever, but comes back and lives ordinary everyday life to help other beings to see through it, too, he doesn’t come back because he feels he has some solemn duty to help mankind and all that kind of pious cant. He comes back because he sees the two worlds are the same. He sees all other beings as buddhas. He sees them, to use a phrase of G.K. Chesterton’s, “but now a great thing in the street, seems any human nod, where move in strange democracies a million masks of god.”

— Alan Watts, Lecture on Zen


> ... instead tailoring an explanation to their personal situation and knowledge.

I’ve used this with success several times. Though you generally have to know the person well enough to know their “secrets”.


> I am sure that the deletion of media files in services like Facebook has never been meant to be absolute. Many of my colleagues believe the same thing that I believe: Facebook and other services do not actually delete data, they just mark it as "deleted" and purge it only if they need the space.

This is a dumb conspiracy theory. Facebook has made plenty of public statements that say otherwise, and there's a whole team that works on the system that ensures every trace is erased from disks, logs, cold storage and backups when deleting content.


Looking online briefly for definitions of "delete":

"remove or obliterate (written or printed matter), especially by drawing a line through it or marking it with a delete sign."

"synonyms: remove, cut out, take out, edit out, expunge, excise, eradicate, cancel"

All of these seem clearly "absolute" to me. "Delete" means it's gone.

I think Facebook has its own special linguistic distortion field. It requires no "dumb conspiracy theory" to realize that Facebook cannot be trusted.


Deletion by flag is very common in IT and presumably has been since the first undelete program was created. It's not a Facebook thing.

Some mail programs have long had a soft delete that requires an expunging process to achieve complete removal.

In an IT setting you can delete a blob from a db, but it might still be on disk, and it will still be in caches, on user machines, and in backups/archives.


Because FB deletes by flag so that content disappears instantly, and then it starts the actual process of deletion (which can take a while because of things like backups and cold storage)...


I'm not inclined to believe PR statements like these when there's no way to verify them.

Can you support your assertion? The infrequent cases where someone manages to extract or recover supposedly deleted data cast a lot of doubt on your claims.

In any case, even if it's not Facebook specifically, it seems overwhelmingly likely that the majority of companies do not actually delete your data.


FB is/was regularly audited by Irish DPC, I think one of the topics was user data deletion. I think that the results were public.


To be fair though, the article that this comment thread is attached to offers some seemingly direct evidence to support one aspect of this 'dumb' 'conspiracy' 'theory'.


Did you read the OP? How can you say that this is a dumb conspiracy theory?


First lesson in DB class: do NOT delete. Just flag.

I can give you plenty of statements about how I'm Santa Claus though.



The accepted answer to the first link you posted explicitly calls out:

> There is one class of data that you have to delete - and that's personal data that the user doesn't want you to hold any more. There may be local laws (e.g. in the EU) that makes this a mandatory requirement (thanks Gavin)

This is exactly the type of data we're discussing here. So no, contradicting the user's expectation when handling personal data is not a "best practice".


Disclaimer: I deleted my Facebook account a couple years ago and never looked back.

That said, Facebook is the one getting collectively stabbed with the pitchfork right now. Engineering best practices are one thing. My right to privacy is another. As an engineer I care about efficiency. As a human I care about privacy. My rights win over any technocratic babble. Sorry if I am being harsh. I am, of course, not surprised. Engineers are lazy at best, and at worst something truly sinister is brewing.


I agree that you have the right to privacy, but there are also technical reasons why instant deletion is not always possible. If they can guarantee that the data will be gone after X days, then that's fair to me.


Yeah, that works. A best effort at actual deletion is good enough IMO. Maybe even a notification when it has actually been swept out of the storage system.


"best" practices...

...as if I didn't already have enough reasons to hate that cliche, thought-terminating phrase... every situation is unique and figuring out what exactly to do for your particular one is probably the main purpose of being a software engineer.


Does this make it right, because others are doing it? I’d say it depends heavily on the type of data and the user expectation.

The correct thing would be to flag as deleted for a sensible period of time (to be able to undo for the user) and then get rid of it after X days when it clearly isn’t needed anymore.
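
Sketched out (made-up schema, arbitrary grace period), that might look something like:

    import sqlite3, time

    GRACE_PERIOD = 30 * 24 * 3600  # e.g. 30 days in which "undo" still works

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT, deleted_at REAL)")

    def soft_delete(post_id):
        # User-facing delete: hide the post and remember when it was deleted.
        conn.execute("UPDATE posts SET deleted_at = ? WHERE id = ?", (time.time(), post_id))

    def undo_delete(post_id):
        conn.execute("UPDATE posts SET deleted_at = NULL WHERE id = ?", (post_id,))

    def purge_expired():
        # Periodic job: anything deleted more than GRACE_PERIOD ago really goes away.
        cutoff = time.time() - GRACE_PERIOD
        conn.execute("DELETE FROM posts WHERE deleted_at IS NOT NULL AND deleted_at < ?", (cutoff,))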


I'm mostly curious how many people are posting angrily about Facebook retaining data while taking a break from implementing a system that retains data.


I'm guilty of the "implementing a system..." part. We are just starting, and data hungry.


Facebook lost billions in days and is poised to lose more.

I imagine there are more pressing issues than the difficulty of implementing a schema.


Not just billions but tens of billions. Somewhere around $50B.


"best practices" is for developers.

For the average Joe and Janet out there, "deleting" something is synonymous with "remove from the internet for eternity".


Except when it’s not, and they want back the data they’ve deleted by mistake.

In those cases it will take a lot of support to explain that what is gone is gone. I think customers don’t have a unified vision of what deleting means, they just want what’s optimal for the situation.


Then let FB show what is happening. Let them change the text on the "Delete" button to "Remove from timeline" or something similar.

How hard is it to clearly state what FB is doing in the background?


>"remove from the internet for eternity" lol, internet never forgets. Everybody seems to have an ilusion of control over digital data shared with others or uploaded to the internet.


From your first link:

> There is one class of data that you have to delete - and that's personal data that the user doesn't want you to hold any more. There may be local laws (e.g. in the EU) that makes this a mandatory requirement (thanks Gavin)


These best practices are about database records and not about files. I'd be very surprised if Facebook stores files as database blobs. These are generally stored on a separate system, and it's quite reasonable to delete the file while keeping the metadata in the database.


After so many links, let me just answer with a link as well:

https://en.wikipedia.org/wiki/Confirmation_bias


Privacy hawks are always looking for a reason to complain about Facebook and scream I told you so.


The most unsettling part is in Facebook's response: “We’ve heard that when accessing their information from our Download Your Information tool, some people are seeing their old videos that do not appear on their profile or Activity Log. We are investigating.” Who wants to bet against their investigation being “how to keep users from seeing it.” Anyone?


I honestly don't understand this cynicism. Facebook does not want your deleted video, and they certainly don't want to keep it given the current media frenzy, with the CEO under fire.

Every application of any complexity has features which inactivate, but don't delete data. At Facebook scale, deleting data is non-trivial, and it would be impossible to immediately delete something.

We all have bugs, including extremely critical security bugs, availability-threatening performance bugs, or many other types of bugs. It's strange that we accept those bugs as merely bugs, without assuming a backdoor, or intentional sabotage, but when it comes to personal data, suddenly it's a nefarious plot. It's an odd position to take that Facebook is not only saving these deleted videos intentionally (for what, exactly?) but that they'll now lie to us and pretend to delete them, but only remove it from their Download Information tool.

Kudos to Facebook for even having such a tool.


I agree with you.

At Facebook-scale the data is massive -- far bigger than anyone here could possibly comprehend and that includes the Facebook and Google-ers lurking around.

Data has incredible inertia. And when there's a lot of it, in a lot of different places, I can imagine that it becomes very difficult to keep track of.

I'm glad that Facebook's data export tool included some things that maybe it didn't expect to.


>I can imagine that it becomes very difficult to keep track of.

The GDPR prompted them to make the data resurface, so it's not impossible to track this data given a few months of warning. It's just that Facebook as a company does not have an interest in deleting data they collected.


No - we've had the download your info feature since 2010 (https://techcrunch.com/2010/10/06/facebook-now-allows-you-to...). There is just a lot more public scrutiny now of what's in there - which found this bug.


If it’s too hard to do properly maybe they shouldn’t be doing it /shrug


Can't understand why you're downvoted. If you can't handle the data you collect, maybe you shouldn't collect that data in the first place? Or invest in technology and hire more engineers to handle the data you collected?


Don’t allow users to delete things?


Don't gather data if you can't manage it ethically.

Facebook has a responsibility to protect user data they have collected, and if they can't then they shouldn't.


After delaying informing users of their data being handed over to third party services and keeping quiet for 3 years, some cynicism is warranted.


they had to have the tool for gdpr. not because they are good guys


The Facebook download tool has been around since 2010. It's not correct to suggest they created this in response to GDPR.


You're right that it's not due to the GDPR, but existing EU law already required them to provide all user info on request. See Max Schrems' work, specifically his 2011 complaints to the Irish Data Protection Commissioner, and the subsequent Europe v. Facebook case.

http://europe-v-facebook.org/EN/Objectives/objectives.html


> Facebook does not want your deleted video

Oh, most certainly they do want that video. Their business is knowing who we are and what drives us, so they can target those ads better. That's what makes their shareholders money.

That the people working there are human beings who might consider it immoral to keep deleted material, is what most people rely on when using such services... but being kind is not Facebook's goal.


It's upsetting that the bug exists in the first place, but there's nothing unsettling about this response. Have you ever reported a bug to an Internet company before? What do you expect them to say?


It's no bet. The investigation results in a ticket for another intern to add "WHERE deleted = 0" to the download tool.


EU Data Protection Law requires that users are entitled to view all personal data (yes that includes videos) that FB has on them.

In May the EU will be able to fine FB up to 4% of global revenue for breaches of this law. Popcorn time!


Great idea! I'll make an FB app for that ;)


Apparently the walls on the garden weren’t high enough - and the sappers are the real culprits!!


One fascinating outcome of all this fallout is that there's now a readymade excuse to stop using Facebook.

My personal observations are that a good number of people have felt 'fatigued' by Facebook for a very long time, but were also unsure of how best to extricate themselves without incurring a social penalty.

But now there's an impetus that most people can understand. I'm not sure how many people will move away or how quickly it'll happen, but the network effects Facebook capitalized on can also work in reverse: it only takes one or two very vocal privacy proponents in a friend circle pushing to get off the platform. One group I'm in recently migrated to Telegram for this very reason.


If you truly want privacy and security I would recommend Signal over Telegram -- Telegram has had some controversy over its unaudited encryption protocol, some weird stuff with a very large recent ICO that seems entirely unnecessary except as a money grab, and Russian subpoenas for their master private keys.


You are getting off one boat and getting on the other one. Destination is the same.

Any centralized communication network is by definition insecure (one point of failure).

Maybe try Tox.

https://tox.chat/

Telegram cooperating with Iran

https://news.ycombinator.com/item?id=16039859


Signal and Telegram are very different. Signal has always been an open source project that allows you to audit the source and run your own server if you so desire [1].

It’s a project that has always put security first but made some compromises for usability — very different from Telegram which has put expansion and monetization first — and it was started by Moxie Marlinspike whose views and contributions are well-known.

With Signal, it is not a single point of failure. The Android, iOS, desktop apps all do end-to-end encryption. So a compromised server wouldn’t mean your messages are compromised.

The client would need to be compromised, and if the client is compromised, tox.chat is toast as well.

[1] https://github.com/signalapp


>>With Signal, it is not a single point of failure. The Android, iOS, desktop apps all do end-to-end encryption.

I meant DoS attack not encryption.

AFAIK all of these secure "apps" are NOT decentralized.

So if you can just block a certain IP, you'd have successfully performed a DoS attack.


> and Russian subpoenas for their master private keys.

While I cannot defend (or attack, I'm no cryptographer) their crypto, they seem to have a solution to this:

They say they don't store keys in the same datacenter or even jurisdiction as the customer data they protect.

According to them this means getting unencrypted data through a legal process would mean getting a warrant in two or more countries at once.


> They say

> According to them

I find it very hard to trust their word. And we know the company has the ability to read messages. How is Telegram better than FB Messenger?


Sorry for my late reply:

> And we know the company has the ability to read messages.

I don’t think we actually know that.

In fact I think they have a system to keep data and keys apart and in different jurisdictions to prevent USA, Russia or anyone from being able to get access to it.

I am no cryptographer or legal expert though.

> How is Telegram better than FB Messenger?

This is a bit simpler: while Facebook Messenger might be E2E encrypted, I have good reason to believe that Facebook will datamine my metadata and sell it to whoever wants to pay.


This only works in the US due to the Cloud Act.


Definitely. I’ve been off Facebook for years but always felt judged for it. In the past few weeks it’s gone from being judged to being applauded for calling it.


Just an anecdote. I had a day set aside for purging my Facebook entries a year or two back. I manually deleted comments and posts.

Of course there were too many to do and it was very boring, so I only spent a couple of hours at it. But that's not what's interesting. What happened was that I got a huge uptick of people commenting on some old post I made, like a profile picture change. I think Facebook saw I was purging my data slowly and reached out to my FB contacts encouraging them to interact more with me. It was very odd.


You made changes to old posts (deleted comments), so Facebook decided that because there were some updates to old posts, it makes sense to treat these old posts as new. So Facebook started to show these posts to your friends in their news feeds.


Isn't that against the goal of a deletion?

FB should honor intentions and let deleted stuff not be flagged as "new".


"What to show" prioritization should mostly focus on the interest a reader (not the writer).

I, as a reader, want to see posts with recently deleted [or updated] comments.


If you're worried about FB analyzing/selling your data then "deleting" does nothing. It effectively just sets a boolean flag on a record in a database which is more like 'hiding'.

It may not appear anymore in the frontend but you can be pretty sure it's still being used by FB. Now that may change after GDPR but who knows...


Is there not a way to delete all posts and comments at once?


I don't believe so, at least not in the recent past. I purged my FB of all content about 18 months ago, and had to do it manually. Took several hours spread across a few weeks, whenever I could force myself to spend the time on it. For whatever reason I kept finding posts/comments for a few weeks after that; I'd go back to make sure I got everything, scouring the timelines, and there'd be something I missed somehow, quite bizarre.


Facebook makes it very, very difficult. But there is a method using a Chrome browser extension, I've used this method recently with success: https://www.mariusschober.com/2018/01/20/delete-facebook-act...

It will take some effort, but it's doable.


So your stuff is not deleted when you delete your account (soft or hard)?


I've never had a Facebook account, but friends have told me that no, there is not. That would go against their user and content retention models, I'm sure. It makes sense that Facebook would make it as difficult, tedious, and painful as possible to delete content from their platform.


We should create a browser extension for that.


There used to be this Android app, Exfoliate, which I used to delete old content. It no longer works.

https://www.androidauthority.com/exfoliate-facebook-app-revi...


The funniest part of it: All the media hype around the topic is generated... BY DATA collected by NYT/Bloomberg/Techcrunch/you name it. Those articles generate additional views and they just continue to ride this wave. And all those publications share this data with 3rd parties (ad networks, analytics providers, cpa networks)

On top of that, you know what else do they measure? SENTIMENT. So until kicking Facebook generates more revenue - the articles will paint Facebook as a world's main evil. But the day sentiment changes you will see all the articles about Facebook following best practices.

And in the end? Some EU commission will be created and make a law which obliges sites to "show a cookie usage disclaimer", because of which 90% of sites welcome you with an ugly popup and ruin the experience while providing 0% advantage in managing your privacy...


> On top of that, you know what else do they measure? SENTIMENT. So until kicking Facebook generates more revenue - the articles will paint Facebook as a world's main evil. But the day sentiment changes you will see all the articles about Facebook following best practices.

So what you're saying is that sites like nymag will only run stories that are profitable?


Kind of. When everyone is writing articles AGAINST Facebook, it will be very hard to 'sell' an article which SUPPORTS Facebook to the editor (because of the potential PR nightmare when, say, 4chan starts attacking you).

Article can't go through editor -> article is not published


Secretly? Secretly from whom?

There is nothing I've come across, ever, that has led me to believe that Facebook, Google, Amazon, etc., ever delete anything, ever. Not even to clean up space as some people on this thread are suggesting. Hard drive space is cheap and data is valuable. This isn't a secret, this is a fairly obvious business practice that all the big players, and most competent small players, are engaging in.


FWIW, Facebook does say that they will delete all of your data within 90 days of account deletion. I believe that indicates that they've put the engineering effort to do a full audit of data to be deleted, handle missing references across the product, and to fully delete user data from logs and backups.

From https://www.facebook.com/help/250563911970368 :

> When you delete your account, people won't be able to see it on Facebook. It may take up to 90 days from the beginning of the deletion process to delete all of the things you've posted, like your photos, status updates or other data stored in backup systems.

The case from the article is trickier. My impression is the feature was just implemented with an append-only data model, which is often (maybe usually) a good engineering decision. "Secretly" from the article title feels disingenuous because Facebook never said it was deleted. As an engineer, it's frustrating that I might have to write my software to be more fragile to match the implicit expectations of how a non-technical user thinks software should work. But the frustration on the user's end is also plenty understandable here. Hopefully the gap can be closed a little on both sides by a combination of educating users and being more privacy-conscious in engineering and business decisions.
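
To illustrate what I mean by append-only, here's a toy sketch (purely illustrative, not Facebook's actual design): a "delete" is just another record appended on top, so readers stop seeing the value but the old bytes are never overwritten.

    # Toy append-only store: nothing is overwritten, a "delete" is just
    # another appended event (a tombstone).
    log = []

    def put(key, value):
        log.append(("put", key, value))

    def delete(key):
        log.append(("delete", key, None))  # the old value still sits in the log

    def get(key):
        value, present = None, False
        for op, k, v in log:
            if k == key:
                value, present = (v, True) if op == "put" else (None, False)
        return value if present else None

    put("video:123", b"...")
    delete("video:123")
    print(get("video:123"))  # None -- gone as far as readers are concerned
    print(log)               # but both events, including the bytes, remain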


Fucking stupid policy - so they have 90 days to offload your shit to Utah/gov-cloud before they “delete” your data?

Who can possibly believe this BS.

Imagine you wanted to delete data on your own system - but when you hit rm it takes 90 days to execute - this sentiment PISSES me off.

When I say “delete me from your service now” I have a reasonable expectation that you will delete it.

C’mon


Do you think it's as easy as "rm"-ing a file away? Your data is kept internally in a multitude of different databases. Parts of it sitting in cold storage. Log files, caches. That data is split across thousands of different nodes. Each system has different data retention policies. Some databases don't permit removal of a specific record - the records must "expire" first. It really does take time to delete data.


True, but this isn't an excuse. It's slow to delete data because Facebook designed it that way. They could have designed for privacy and real-time deletion of data, but they didn't, because they didn't care.


> "They could have designed for privacy and real-time deletion of data"

Actually, they could not. If data is geo-replicated across multiple clusters, spread all over the place, divided into hot and cold storage layers - it's crystal clear you can't perform "real time deletion of data". Instantaneous deletion of all data, leaving no trace behind, can not happen under such complex constraints.


>> > "They could have designed for privacy and real-time deletion of data"

> Actually, they could not. If data is geo-replicated across multiple clusters, spread all over the place, divided into hot and cold storage layers - it's crystal clear you can't perform "real time deletion of data". Instantaneous deletion of all data, leaving no trace behind, can not happen under such complex constraints.

Yes, they could have. Your post is just a description of a design that can't delete data quickly. That doesn't prove that no design exists which can delete data quickly.

If Facebook had been designed with "we need to allow users to delete their data quickly and permanently" as a constraint from the beginning, it wouldn't look like the system you've described.

All you've done is pick all the things that Facebook did and say that if you do those things you can't delete data quickly. Yes, that's true--which is why Facebook would not have done those things if they cared about allowing users to delete their data.


It's interesting that you think they need those 90 days to offload your data. As if they hadn't done so before your deletion.

By the way, rm does work like that. The file will just be marked as deleted (by removing its entry from the filesystem index), but will remain on your disk for some time afterwards, from some minutes to months. If you want to ensure deletion, you should be using shred.
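
A rough Python illustration of the difference (with the caveat that on SSDs and copy-on-write filesystems even an in-place overwrite may not actually destroy the old blocks):

    import os

    path = "secret.bin"
    with open(path, "wb") as f:
        f.write(b"sensitive data")

    # rm-style delete: only the directory entry goes away; the blocks that
    # held the data are merely marked free and linger until reused.
    #   os.remove(path)

    # shred-style delete: overwrite the file's blocks in place first,
    # then unlink the name.
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(os.urandom(size))
        f.flush()
        os.fsync(f.fileno())
    os.remove(path)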


> By the way, rm does work like that. The file will just be marked as deleted (by removing its entry from the filesystem index), but will remain on your disk for some time afterwards, from some minutes to months. If you want to ensure deletion, you should be using shred.

This is true, but it's worth noting that not overwriting your own data on a machine you physically own as an optimization is very different behavior from not overwriting someone else's data on your server when they request that you delete it.


I couldn't agree more.


“Rm doesn’t work like that, so clearly it’s ok that Facebook takes 90 days to delete your data after you make the request”

Utter horse crap


If you put words into other people's mouths, you're only creating confusion for yourself. I meant what I wrote, and nothing more.


> Fucking stupid policy - so they have 90 days to off load your shit to Utah/gov-cloud before they “delete” your data?

FWIW I would be surprised if there is not a delay when deleting from S3/Azure Storage/GCS (and thus anything that uses those services).


Google deletes your data within I think 64 days if you request them to or delete your account.

https://support.google.com/accounts/answer/32046


There's nothing there which clarifies that "delete" isn't a euphemism for "flag it to no longer be displayed to users" like it is everywhere else where companies collect data on users, so you'll excuse my skepticism.


Exactly and they have the perfect excuse:

"Well when talking about deleting we mean we do the exact same thing the file system does to a file, it flags it, but doesn't actually erase it's content. Acting like a filesystem delete operation is what people expect when using that word"


That would be somewhat reasonable if it were just an implementation detail. But unfortunately, it's not just an implementation detail. When a filesystem has data to write and runs out of hard drive space, it overwrites the data which was flagged for deletion. But when a web 2.0 company has data to write and runs out of hard drive space, they buy more hard drive space, usually automatically.


It probably is an implementation detail. I'd bet you 64 days is the time it takes for all their backups to be rotated.

Not that I think that's legitimate! Implementations can be changed, just like you could write a filesystem that shreds files when they are deleted.


Clearly there is a big disconnect here. It seems somebody is suggesting there should be a correlation between a user removing content from their account and Facebook destroying some of Facebook's property.

Anything submitted to Facebook is the property of Facebook. Users have no business telling Facebook to destroy Facebook property.


Facebook disagrees: "You own all of the content and information you post on Facebook"

https://www.facebook.com/terms.php


It also says "you grant us a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content that you post on or in connection with Facebook (IP License)".

This link addresses your concerns by clarifying a prior form of the above mentioned policy: http://legalteamusa.net/tacticalip/2013/02/11/does-facebook-...

I suspect the Facebook IP Policy is a standard blanket of legal compliance if the following is that policy: https://www.facebook.com/help/intellectual_property

In short Facebook can do anything they want with the IP content you provide to them. It also identifies, by example, IP content as media you provide to them. For that material the policy is pretty clear, but what about other material? What about textual content that is typed into Facebook and identified relationships? It seems this information is covered by the same policies and is IP subject to Facebook's use.

My understanding of Facebook policy is also likely dated as their terms change periodically. The current policy is dated at 30 January 2015.


If you read a bit further, it says the license ends when you and the other users with whom you have shared the content delete it, which was the case we were discussing.


> Anything submitted to Facebook is the property of Facebook.

I'm not sure how you came to this conclusion.

Are we to accept that this is how it works simply because Facebook says this is how it works?


You accept that when you join.


Users didn't accept anything just because they checked a checkbox next to a link to an ever-changing jumble of legalese to get past a screen. This isn't agreement, it's manufactured consent.


Unfortunately no matter how many times you say this or how much you wish it were true, US (at least) courts have disagreed with you by enforcing contracts of adhesion.


I think we're in agreement, we're just saying things slightly differently. I believe human rights exist and are an ethical imperative whether or not lawmakers/courts choose to protect human rights.

Or put another way, the law should be (but often isn't) informed by human rights--human rights aren't informed by the law.


Extortion


That's like saying having to pay for any product is extortion, just because it's something you want but you don't want the consequences - in this case, paying for it.


Facebook's terms of service agreement is what you must consent to in order to open or maintain your account. These are the rules not because Facebook says so, but because you say so when you agree to their terms.


Let's be clear here: I'm not the Facebook user in question. I've never uploaded a video to Facebook and never will.

Users didn't sign or agree to anything just because they checked a checkbox next to a link to an ever-changing jumble of legalese to get past a screen. This isn't agreement, it's manufactured consent.


> Users didn't sign or agree to anything just because they checked a checkbox next to a link to an ever-changing jumble of legalese to get past a screen. This isn't agreement, it's manufactured consent.

What is the difference? I am thinking if a person really actually cared they would have read the legal agreement before checking the checkbox in question and possibly consulted an attorney of their own. I am thinking most users absolutely do not care and agree out-right and immediately to all claims presented by Facebook. How is that not still agreement?


You can't claim users don't care about their videos not being deleted--the fact that they do care is exactly why this is in the news. They may click past a screen because they think that they don't care, but that's only because they don't understand the implications of doing so. Part of the reason is that a lot of people naively believe that a respectable company like Facebook wouldn't try to screw them over, and would behave with their best interests in mind.

It's unrealistic to expect that users will read AND understand the TOS of every website AND all of the changes to the TOS that occur over time.


>Secretly from whom?

The vast majority of users.


Huh. So now I'm starting to think: what if I purposely recorded thousands and thousands of meaningless video? None of my friends would ever see them since I never published them, but Facebook would use up hard drive space storing them.

What if a lot of people did that?

Suddenly Facebook's cost for hanging onto all these videos would become quite high with no value in doing so.

Anyone feel like making a website to help automate that process?


The scale you'd need to achieve to have even the most minor effect on Facebook's vast infrastructure would be enormous.

I feel the effort you and countless opt-in people would expend could be redirected to much more fruitful efforts. Convincing people to delete their profiles, for example.


I generally agree with other people's posts that there are better uses of your time.

However, if you wanted to do this, why bother recording it? Read a little about the mp4 spec and it would probably be fairly easy to generate mp4s containing random data. You could even go further and generate video that tricks facial/object recognition.


You should upload random-pixel uncompressible videos and delete them over and over again. It not only increases storage cost but makes the profile SNR very low.

There are general-strike level attack surfaces on these networks, but people don't really care that much.


Does their AUP/TOS allow them to lock you out in that case?

Lockout would be worse than account deletion. You would have no recourse to fight back on any of their use of your data right?


> You would have no recourse to fight back on any of their use of your data right?

Yes you would if you were European. You can't stop people exercising their legal rights because they broke your terms of service.


Facebook might actually love this. Because for every 20 useless videos you upload, you may click, comment, or view some facebook message that pops up while you're uploading. It could become a net gain for them.


What if a lot of people did that?

That is exactly what a lot of people do. I don't think Facebook is conspiratorially hoarding funny cat videos. It's just standard practice to flag things as deleted. Everyone does it. I certainly have.


This (and the GDPR - even though my company is in the US) is why I now tell the developers on my team not to collect info unless they have a definitive use case for storing it. And make sure to delete data as soon as it is no longer needed.

It helps that my business doesn't monetize via advertising.

My guess is what actually happened here is that they had a use case to store it for a few hours or so (in case, say, the user changed their mind about posting it) and no one ever bothered to write a cleanup script because "storage is cheap" and possibly "maybe we might have a use case for it someday".

I can't imagine this being intentional. Even if I try to consider malicious use cases, I can't think of any where it would be beneficial for Facebook to actually store this data, besides being too lazy to clean it up.

Edit: Wow. Who knew applying Hanlon's razor to this would get me downvoted so badly? I'm going to leave this here unedited and eat the downvotes of the people on an anti-Facebook warpath, because I think it is important to state that we as people who make tech products often take shortcuts (like avoiding dev work because it is cheaper to just keep data in storage) and we need to stop doing that. There is plenty of stuff Facebook does that is beyond the pale, but this one is more likely attributable to laziness, and if you downvote me without explaining why, you are not contributing to the conversation.


Please don't break the site guidelines by going on about downvotes.

https://news.ycombinator.com/newsguidelines.html


My apologies, dang. I did not realize and/or forgot that was against the guidelines. Will not happen again.


Thanks!


> I can't imagine this being intentional. Even if I try to consider malicious use cases, I can't think of any where it would be beneficial for Facebook to actually store this data, besides being too lazy to clean it up.

Nope. They wrote a routine that makes the video invisible to the actual user but refrained from deleting it right away. That is intentional.


> That is intentional.

It's funny you are so sure of this.

And you essentially just called me wrong without providing a use case. My statement was I couldn't think of a use case for them to do this intentionally and your statement does nothing to disprove that.

Hiding a video can be, and probably is, just an "UPDATE videos SET visible=0 WHERE id=123". And it is extremely common to soft delete things for hard deletion later, in case the user made a mistake or law enforcement requests it or any number of reasons.

Especially in large distributed systems where things often need to happen asynchronously.

Not permanently deleting a soft-deleted file is not necessarily intentional. Anyone who has ever worked on a large software project knows about backlog stories (say, hypothetically, "free up space from soft deleted videos") not being done for years because other priorities keep pre-empting them.

Similar reason when writing in a garbage collected programming language the memory isn't freed immediately.

Does that mean it was unintentional? Not necessarily but it certainly is plausible that it was unintentional.


Okay, so if it wasn't intentional, how else could old videos still be in the system? Video is not a new feature, and the videos date back as far as 2008.

Facebook may be a bad player, but they have tons of talent working for them. Are you seriously suggesting that they stored a decade worth of videos that never saw a single view/download and nobody there realized it?


I'm not on Facebook's side here but if you have to consider deletes in a large storage system then you have to consider fragmentation. I implemented a storage system similar to Facebook's Haystack (which they use for photos and videos) and did exactly the same - mark the replaced or deleted object as such and ignore it until a separate process compacts the stack.

Compaction means copying huge quantities of data around and re-indexing everything in a particular stack, while at the same time maintaining good read performance. It's an expensive operation and not worth it unless you desperately need to reclaim space.

The alternative is to overwrite deleted content, but that carries a performance cost because new files may need breaking up to fit into smaller gaps left by deleted files, leading to IO devices spending more time seeking per-object. Defragmenting such a scheme is even more expensive than compacting a haystack-style scheme.

So yes - the system may not actually destroy the bytes on disk by design. However, it should not report those objects as still being available to layers above it since doing so may lead to inconsistency. This leads me to believe that nothing was actually even marked as deleted, it was simply never "published".
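
A toy in-memory version of that scheme, just to show the shape of it (not the real thing, obviously): delete only flips a flag, and only compaction copies the live objects into a fresh store and drops the old bytes.

    # Toy haystack-style store: objects are appended, "delete" flips a flag,
    # and compaction copies live objects into a fresh store to reclaim space.
    store = []   # list of [deleted_flag, data] "needles"
    index = {}   # object id -> position in store

    def write(obj_id, data):
        index[obj_id] = len(store)
        store.append([False, data])

    def delete(obj_id):
        store[index[obj_id]][0] = True  # the bytes stay exactly where they were

    def read(obj_id):
        deleted, data = store[index[obj_id]]
        if deleted:
            raise KeyError(obj_id)
        return data

    def compact():
        global store, index
        new_store, new_index = [], {}
        for obj_id, pos in index.items():
            deleted, data = store[pos]
            if not deleted:
                new_index[obj_id] = len(new_store)
                new_store.append([False, data])
        store, index = new_store, new_index  # only now is the deleted data gone

    write("v1", b"cat video")
    delete("v1")
    print(len(store))  # 1 -- still physically present
    compact()
    print(len(store))  # 0 -- reclaimed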


I would think this is common practice.

I know that YouTube, for example, retains videos indefinitely, because I've personally been able to retrieve videos that were deleted as early as 2006.

It was possible for anyone to do this until some time in 2017, when they started requiring signatures for RTSP streams. All that was needed was the video ID (the eleven characters in every YouTube video URL). Didn't matter if they were private (IDs for these could be enumerated if the channel ID was known), "deleted" over ten years ago, or behind a paywall.

From ~2008 until 2015, you could do the same but with higher quality streams through the now-retired Apple TV API.


I'd have been more surprised if they didn't save them. I always assumed that anything remotely hosted that I "delete" is soft-deleted, and that anything I edit is actually just versioned. (I'm not claiming to be especially smart, just cynical.)


This was a bug. There was an old feature that used to allow you to record and post directly from the browser. Those videos were streamed to FB as they were being recorded. If you decided not to post, those draft videos should have been deleted but were not. They showed up in Download Your Information (DYI) as expected because that tool is designed to show you the data Facebook has about you. Thanks to New York Magazine for the flag. If you see anything in DYI that doesn't look right, let us know and we'll investigate. This was a bug, and we really do appreciate any help in finding them so we can fix them.


I downloaded my information and then deleted my account before realizing that the archive I downloaded did not include any of the photos or posts that I had been tagged in, because I made those posts only visible to me on my timeline.


If these are posts by other people that you were tagged in then those posts should still be up, just without the tags of you, on the original posters timeline.


Sure, this just doesn't do me any good because I don't have them archived or any ability to access them :)


Is this really a secret or a surprise? Most SaaS companies of this size don’t really ever delete anything. They set a deletion flag and call it a day.


Yeah, for most things this is a good way to do it. For user data, though, when someone deletes their account it would be ideal if they actually removed it.


Why would that be the good way to do it? Especially if that deletion action was behind a confirmation, or if the data was never recoverable by the user? At that point, just delete delete it.


A lot of reasons, here's a simple example: if you have a "product" created through a CRUD interface and users who add it to their cart, deleting it would cause relationships to fail - users who already purchased the item or added it to their cart would no longer be able to see it.


I'm shocked. No one saw this coming.

* Facebook reads your private messages, California class-action suit alleges. || https://www.slate.com/blogs/future_tense/2014/01/03/facebook...

* Facebook asks users for nude photos in project to combat 'revenge porn' | Technology | The Guardian || https://www.theguardian.com/technology/2017/nov/07/facebook-...

* Facebook CEO Admits To Calling Users 'Dumb Fucks' || http://gawker.com/5636765/facebook-ceo-admits-to-calling-use...


Someone should also mention that in the downloadable Facebook archive, in the html/ folder, there is a file called contact_info.htm, and it's a pretty large file. It is apparently every Google contact you ever had, all synced to Facebook, for every device you ever logged into. So, if you ever used a friend's device to check your Facebook account using the Facebook app, then all of that person's contacts are there too, as well as their SMS history metadata and call history metadata.


I’ve let quite a few people log into the Facebook app on my phone and now I’m pissed.

I was always careful with privacy settings on Facebook, but the thought never even crossed my mind of what I’d be “agreeing” to by letting someone briefly use my phone.

I’m sure someone will come along shortly to tell me I deserve it and should have known better and was asking for it, but whatever.

How will the GDPR handle instances like this?


I imagine that the data in those downloads is a fraction of all the data Facebook collects. It seems that they disclose only what is required based on local laws, so it's unlikely they will ever disclose derived data unless forced to (for example, the location data they collect and combine to figure out which people were present at the same party etc.)


Btw why do we think that the downloadable archive contains everything they have? Because they said so?

Just add a bool field to the table "canExport".


I don’t think we think that. Obviously, in the context of HN crowd, FB has no real credibility / ethical compass.


This is no surprise, virtual delete is common design pattern.


This is why I've never really deleted my FB account, I've just deactivated it. If I don't believe they would truly delete the data, I might as well have it available if I want it.


I don't think facebook really deletes any data. You deleting a comment or a picture pretty much means: isvisible=false;


More like visible=false;


Display=none is more appropriate


Maybe visibility:hidden too.


There are people on this very site complaining about how "hard" it is to delete data on request to be GDPR compliant. Highly paid developer experts are literally throwing a fit when told that they should be able to delete data when the "Delete" button is clicked.

It's not just Facebook.


Storage costs next to nothing, recovery costs a lot. Why would Facebook, which depends on this data for income, ever delete it? You get two massive benefits from keeping it and just setting a "deleted" flag.

Why would anyone expect anything different? How entitled do people feel they are?


It's quite simple: if someone says they deleted something, you expect them to delete it. If they don't, they are lying, and you can't really trust them with anything else.


Facebook has been embroiled in back to back privacy scandals since it opened and you still trust them? How much simpler does it get, I agree, that's pretty simplistic!


A lot of these revelations are coming from the fact that Facebook allows you to download the information it has stored about you.

There is an exceedingly high chance that the managers are going to notice this. And, rather than doing the right thing by storing less information, they are going to lie about what they have, and put a filter in place regarding what they let you download (a bit) vs. what they actually have (everything).

We need to be nuanced in our approach to the world, but it is becoming increasingly clear that Facebook has created a business model that incentivizes (and maybe depends on) evil behavior.


What I think so many privacy advocates don’t realize while frothing at the mouth is that odds are 99.9999% no one really wants your old photos or data specifically. Surely in large aggregate, but on an individual level you are no more interesting than anyone else. You're not. You have delusions of grandeur.

Sure, you might get some targeted ads from data used in aggregate to put you into a group, but so what? If I have to see ads, I'd rather they be for things I am interested in.


Ah the old nothing to fear argument.


Oh, there’s plenty to fear, this just isn’t it.

The proliferation of cameras and cellphone tracking hooked to state owned machine learning predicting your decisions - which is publicly happening in China and almost certainly quietly happening everywhere else? Terrifying.

Data collection on crap that’s nearly public anyway? Merely a distraction.


It's kind of unsurprising. As soon as you upload some data into another system that you do not fully control, you can't really expect or trust the other party to discard it because you want to.

Unless there's a good and easy way to store that kind of data and share it with end-to-end encryption (where the server NEVER has access to those keys) so that only authorized users can view the plain data, that problem will remain.


In all likelihood, this is simply the tip of the iceberg. I see Twitter's stock is collapsing, due to similar concerns. Nothing is private anymore in this day and age.

And yet - If we're concerned about what Facebook has on us, then just imagine what kind of treasure trove the government sits on.


I’m looking forward to HN letting go of this enthusiastic expectation of FB’s demise and putting forth higher quality articles.

(Note: I feel the same enthusiasm. I just want to read more meaningful, comprehensive articles.)


I considered this a feature. It's interesting to compare true convo vs what's left after people have backtracked and deleted messages. Especially when the convo was daring so to speak. ;)


Well, once you give permission to use your data, you no longer control how it's stored and shared (at least physically). To make sure you are not caught off guard, you should always expect the worst.


Could be non-malicious if they want to prevent redundancy of the same video going up twice or just laziness in engineers who didn't build D in their CRUD.


And this is why I never signed up for social media in the first place. Remember the adage about stuff on the internet never really going away? Yeah, it's true. Another interesting note: some friends hiring in Austin, TX have told me they can't hire in town because almost every kid has a social media presence containing drunk pics.


who cares if they have drunk pics online? is the employer supposed to dictate my morality?


"They're looking into it."

AKA they will continue harvesting the data but be better about not letting you know what is being harvested.


I've downloaded my zip file to try to verify what's going on in the article

I think I have an idea of what might have happened.

When you add a video to the composer window

One of the requests is https://vupload-edge.facebook.com/ajax/video/upload/requests... (Look it up in the network tab of whatever browser dev tool you are using)

With the response as,

for (;;);{"__ar":1,"payload":{"video_id":"11111111111111","start_offset":0,"end_offset":353662,"skip_upload":false},"bootloadable":{},"ixData":{},"gkxData":{},"lid":"1"}

The video 11111111111111 is now in an "unpublished" state. "unpublished" here meaning it's uploaded to Facebook but not linked to a post yet.

You can verify this by taking that ID and doing the following

https://www.facebook.com/11111111111111/ -> redirects to https://www.facebook.com/phwd/11111111111111/

"Sorry, this content isn't available right now"

Your options now are to either discard the post or publish with a privacy setting which will make the link above available. (Notice I didn't say discard the video, the video is still in an unpublished state)

Now for the archive.

You can verify by going to view-source:fb.com/me in a browser. Search for the string "access_token"; there will be a long string appended (e.g. access_token:"EAAAAU...).

With that token go to your archive and roll over one of the links in the video section that has an issue and doesn't appear in the activity log.

file:///Users/phwd/Desktop/facebook-phwd-from-zip/videos/11111111111111.mp4

grab the ID 11111111111111 and do the following

https://graph.facebook.com/11111111111111?access_token=THE_T...

That shows an unpublished video for me; it wouldn't show in your activity log (that's the only part of the story I can agree with and confirm with what I have available).

To delete add the method=delete to the request.

https://graph.facebook.com/v2.9/11111111111111?method=delete...

Response should be

{ "success": true }

The next part would be to verify that the video is deleted from the archive. Since Facebook is still giving me the first download zip, I guess I'll have to wait a while (it's 1 am here so I'm heading to bed) until it resets so I can make it build a new archive and confirm the hunch.

This is just my guess, I'm NOT discounting what the Facebook user encountered. I'm just providing a possible background to how it can happen as well as a solution to deleting the "deleted" video. There is also the chance I might be wrong...

References to confirm for yourself. developers.facebook.com/docs/graph-api/reference/video

Disclosure: I don't work for Facebook, however, I do play with their API a bit.


* This Is Why You Should Delete Facebook Permanently || https://www.huffingtonpost.com.au/entry/delete-your-facebook...

From 2014. Good summary video.


How do I mass-delete all of my Facebook data? Likes, posts, etc. Is there a reputable tool out there that can take care of it for me since Facebook doesn't provide you with one?

I'm generally ok with keeping Facebook just as a contacts list, but I'd rather not have it have anything else.


Storage is so cheap, it wouldn't be surprising they save everything they possibly can.


I can't believe people don't pay attention. Facebook never removes any data generated, they just remove the index to the data. Deleted data can be more valuable than data that is kept.


Isn't it kinda obvious that FB 'secretly' stores everything that may be beneficial to the company? Who is still so naive as to believe that they care about your privacy?


For maximum honesty, they should just rename all of the Delete buttons and text to 'hide' or better yet, 'hide from friends'.

Short, simple and to the point.


It was already known that Facebook burns all user data onto a Blu-ray disc.

How do you erase data that's been permanently burned onto an optical medium? You can't.


This has happened numerous times in the past, especially when they switched to timelines. Everything you thought was deleted reappeared.

Basically once in Facebook, always in Facebook.


This is why for those 'special' videos one should always just use a handheld camera and not their cell phone. But the public will never learn.


Lol I'm not impressed by anything at this point. I thought it was clear from the start, they don't care about you, you're the "useds" of Facebook, as Stallman would say. But still, every new piece of evidence that comes up confirming this must make it into a separate headline, and we'll keep on getting plenty of those until... I don't know. Until it either dies or rebrands itself well enough to pretend all of this never happened, I suppose.



Given any data submitted to FB legally becomes their property, they don’t have any obligation to delete it.


I always believed in the right to be forgotten. That's why I am not on Facebook nor Google Plus


How does being on HN help? Your comments can't be deleted after a certain amount of time (a few hours).


At least Hacker News is transparent about that behavior.

Facebook et al. are not.


How about YouTube? Is there any law specifying a website must delete the data a user chooses to remove?


That what you put on the Internet might stay there forever is an important lesson.


This cannot be a surprise to anyone with just a bit of skepticism.


Agreed, but most do not have much skepticism. You and I may have known about Facebook and had accounts (or deleted them) for over a decade, however many current users are relative internet novices.


Meanwhile, FB share price is on the up again ha


But still very much down from before the CA story dropped.


Did they save any of the videos that Google Drive have recently deleted?


"surprise"


Here's something that I noticed earlier today that might be of interest:

420 million Facebook profiles uploaded to archive.org

http://www.newshub.co.nz/home/new-zealand/2018/03/the-kiwi-s...


Definitely worth noting as another example of how data can be harvested — but also important to point out it was a third-party-created archive made from 2007 and 2010. People will still be (justifiably, IMO) unhappy. Just as they were when YourOpenBook made the risk so blatantly obvious.



that link doesn't work for me... this one appears to be the same thing: http://start.att.net/news/read/category/politics/article/for...


Sorry about that. It's just a brief article.

Here's a no-JavaScript version:

    # fetch the AMP version, keep only the <p>...</p> content (GNU sed), then open the local copy
    curl -o 1.htm http://amp.timeinc.net/fortune/2018/03/31/facebook-employees-are-reportedly-deleting-controversial-internal-messages

    sed -i '/<p>/,/<\/p>/!d' 1.htm

    firefox "file://$PWD/1.htm"


"I do also think that, you know, Facebook has a responsibility to its users to protect their data and not just to protect it but make sure that people understand what data they're producing and whether they own it, who has access to it and when.

And Facebook has failed them, you know, across the board.

And the question now is not just what - you know, what can be done to ensure the security of that data. It's, how can we use this moment to ensure that we're having a broader cultural conversation about the data that we're all creating on Facebook, Google, Amazon, through our phones, et cetera and make sure that the companies are held accountable for it?"

Source:

Facebook co-founder, Chris Hughes

https://www.npr.org/2018/03/30/598208043/should-facebook-use...


What app can put 2 major spy agencies in the same app?

CIA and FBI in the Facebook app.


"The guy who was showing me around pointed out where they were building "apartments for our people to live".

"So they'll work on campus, they'll eat on campus, they'll socialise on campus and now they'll sleep on campus?" I asked. I wondered whether maybe that was kind of unhealthy. Creepy, even.

My guide looked right at me, and for a moment, his megawatt smile faltered. When he first worked there, it reminded him of the Dave Eggers book, The Circle, he said. Then he started talking about the opportunity to connect the world's people, and I stopped listening."

https://en.wikipedia.org/wiki/The_Circle_(Eggers_novel)

https://en.wikipedia.org/wiki/The_Circle_(2017_film)

Source:

https://www.irishtimes.com/life-and-style/people/jennifer-o-...


While this may sound creepy to a foreigner, getting a decent apartment in the SF Bay Area is really expensive and the commutes are horrible. So in this context, that could be a good thing.


Why does it sound creepy? What percentage of the population, historically, have been "free" in a sense that they don't belong to an organization that exerts some sort of control over their social world? It seems there are many shades and dimensions here...


Actually I think in some ways the lower classes have more psychological freedom here.

I've worked a bunch of minimum wage retail jobs and all the grunts mock the daily "walmart chant." At high status jobs people apparently really buy the "my employer is who I am" thing.


"At the same time, the proles are freer and less intimidated than the middle-class Outer Party: they are subject to certain levels of monitoring but are not expected to be particularly patriotic. They lack telescreens in their own homes and often jeer at the telescreens that they see. "The Book" indicates that is because the middle class, not the lower class, traditionally starts revolutions. The model demands tight control of the middle class, with ambitious Outer-Party members neutralised via promotion to the Inner Party or "reintegration" by the Ministry of Love, and proles can be allowed intellectual freedom because they lack intellect."

https://en.wikipedia.org/wiki/Nineteen_Eighty-Four :)


No, I'm pretty sure this is creepy to non-foreigners as well.


I don’t know. It makes rents higher, since companies pay more, it takes units off the market, and they can deduct the expense, unlike traditional renters.


I think the opposite is more likely to be true. This sounds creepy because it is creepy and the only place it seems reasonable is in the increasingly disconnected culture of Silicon Valley.


Many undergraduate, graduate and postgrads live on campus in housing provided by the university, and get paid by the same university. Soldiers in the military live in barracks, even some who would otherwise have the option to live off base. Even expats working for private energy companies abroad sometimes live in corporate housing campuses.


Universities are not top-down controlled institutions - profs have tenure (i.e. independence) and run their labs at a relatively micro scale, so whatever the criticism of tech mega-corps, it probably doesn't transfer because of those differences.

Soldiers in the military often have a hard time adjusting to the freedom of civilian life, and it seems that structure is there out of institutional necessity (i.e. society could not exist without a military to protect it), not for the benefit of the individual soldier. Creating corporations that are both single-leader authoritarian and control more aspects of their employees' lives may not be something we want more of in our society. A lot of successful corporations seem relatively all-consuming for their employees (both small and big corps), so it seems like a subject without a solid path toward consensus.


Yes, and socializing students into broader society is a core topic for "campus life" divisions at universities, and transitioning back to civilian life is a significant challenge for many when discharged from the military. These are closed and walled off ecosystems of human engagement, each with their own challenges and failure modes for human psychology.


> Many undergraduate, graduate and postgrads live on campus in housing provided by the university

Last I checked, universities weren’t busy papering the world in surveillance, pretending that’s a moral prerogative, refusing summons to the House of Commons, characterising whistleblowers with contempt, and then lying when they get caught acting unethically.


Universities have their own police forces, surveillance systems (they know every time you badge in or out somewhere, along with video footage and who knows what else) and when I was in college there was talk of installing metal detectors (thankfully the powers that be realized how idiotic this would have been). I'm not arguing that this makes surveillance OK, but from a perspective of housing, this feels overblown.


Isn't this just a 21st century company town? What's old is new again.


A bit like university.


Except at least at university there's a built in time limit, and the focus of all university time is implicitly "planning for what you do when it's over". There's usually even staff whose roles include helping students transition and adjust out of it, into broader society. The company town, however, is an end unto itself.


Everything sounds fine in the company town. Next thing you know it'll be really convenient to introduce a system, similar to colleges, where you have FBucks on your company ID. You need never venture outside; in fact you can load your fbucks straight from your paycheck!

Eventually you forget what the conversion rate between fbucks and USD is, but you dismiss any weirdness as a convenience fee; you're well paid, right?

Next thing you know, you can't leave because there's no reasonable housing elsewhere in the city. And so on.

Edit: Even if it doesn't become more dystopian for the worker, who now lives every moment inside the company, is alienated from their fellow humans, and increasingly loses even the ability to think dissenting thoughts (nowhere is safe, not even home), the company has succeeded in capturing back most of the money it paid its workforce. That puts the lie to the notion that profits trickle out to the rest of the city, and it increases inequality.


Well tbh Asian companies have done this to great success.


I guess that depends on your definition of success.


In my 20's I worked at a megacorp, lived in a corporate dorm and ate at the corporate cafeteria. It's a welcome option and you're perfectly allowed to opt out.

I opted out of the corporate dorm eventually, and so did a handful of others, especially the ones who were from the local area. But it definitely made life simpler for me to transition to a new job and living environment.


I disagree that onsite housing at company HQs is always bad; it depends on how it's used. Tons of companies already rent housing directly for interns, newly relocated employees, and, for shorter terms, for people interviewing or on short business trips.

I imagine doing so is cheaper in the long term than what is currently spent on hotels for the same population. But yes, long-term on-campus housing is a bit weird in this day and age (though many rural communities were built by companies needing to house their employees near the factory).


I always hear criticism of the Japanese model, where the employer provides assistance and coordination for employees’ personal affairs. I spend too much time at work and don’t have time to tend to errands, cleaning, life admin, etc. With the demands placed on professionals these days, why shouldn’t the employer bear some responsibility for my personal home life rather than leave us to figure it out, as in the West?


I really do not like fb at all!


I have no such duty. I'll teach my kids when they get a bit older, but trying to teach adult friends is an unrewarding exercise in futility (mansplaining, even).


Please don't post flamebait to Hacker News! I'm sure you didn't mean to, but you triggered just the sort of umpteenth regurgitation we're hoping to avoid here.

We detached this subthread from https://news.ycombinator.com/item?id=16725399 and marked it off-topic.


>mansplaining even

I am fascinated to know how you are going to wedge gender politics sideways into this completely gender neutral conversation.

I agree that talking data storage strategies to people that aren't interested is unrewarding, but please mansplain to me how explaining technical concepts is mansplaining? Ideally, also define mansplaining in the process.


I believe the label "mansplaining" is used by non-males to describe men teaching something in detail (often when the complex idea is over the head of the person applying the label). Some would rather not be educated by men at all, and a man explaining technical details that no one asked for qualifies. Many men have stopped explaining things for fear of that label, even though most thought they were doing a favo(u)r by sharing. Society as a whole is affected.


It’s most often used when an idea is far _under_ the explainee’s head to the point that any explanation is absurdly unnecessary. That is, mansplaining usually refers to explaining while incorrectly assuming the incompetence of women.


You seem not to know what mansplaining is supposed to mean, whether you believe it exists or not.

It just means explaining something to someone who already understands it.


It's simple:

- most folks in tech are men

- most men date women

- in 2018 any attempt a man makes to explain something to a woman is a candidate for being accused of "mansplaining".

I'll say one other thing:

I honestly have no clue if mansplaining has a more technical definition but in general I see most people interpreting it as a man explaining anything to a woman.


It originated with Rebecca Solnit, who had an experience where a man and she were talking and a book came up that he had read. As he's telling her about the book, she points out that she's the author of the book, and he proceeds to continue explaining the book to her.

I certainly see this happening sometimes. Earlier this month I remember sitting in on a meeting where a boy fresh out of college was trying to explain SQL injection to a (female) senior security engineer.

That said, I think that happens to everybody, albeit disproportionately to women. And now mansplaining has other really silly uses as an epithet, like "a male with expertise is telling me something I don't want to hear" or "a male disagrees with something I believe very strongly."


I’ve most often heard it used in the context of a man explaining something that didn’t need explaining in the first place. “If you don’t screw this screw in reallll tight, it’ll come loose. And you know what happens if it comes loose, right? Well, first...” etc. It’s an assumed and implied ignorance of the issue at hand.


My understanding is that the whole point of the term is to denote explanation that is redundant and caused by a man assuming a woman's incompetence. Like some story about a man explaining a book to the female author of the book.


[flagged]


This is not a substantive comment, and it has no place here. If you have an actual argument to make, especially where you can point to reliable evidence, please do.


Cue wave of outraged HNers because some people believed Facebook cared.


Please don't post unsubstantive comments here.


I don't think pointing out how disconnected half of the userbase here is is unsubstantive. 99% of people who use Facebook don't have the faintest idea of what Facebook is doing, why they're doing it, or why it might be wrong. Still, on each of these threads the top comment invariably starts with "why is anyone surprised that...", etc. I could find you links to prove my point, but I can't believe you're not aware of this. So I strongly disagree that my comment is unsubstantive.


It was unsubstantive by the standard we moderate HN for–shallow and repetitive, made worse by snark and tropiness. Please don't post like that here.

https://news.ycombinator.com/newsguidelines.html


HN users live in this weird Silicon Valley bubble that is divorced from the real world..


Only about 10% of HN users are in Silicon Valley, and many of those are critical of it.


FB is not dropbox, so "deleting" means a different thing.


Maybe, but that’s not the case in this article, which describes an obsoleted video feature.


At what point was any of this news? The TOS is clear on this, so anyone it bothered would have read about it beforehand. If I were running a data-collection business like Google's ad analytics, I wouldn't delete anything either; that's your bottom line you're wiping away!


Fortune cookie:

Forgive and forget. Someone must do it and everyone else will not.

https://arstechnica.com/information-technology/2014/01/faceb...


I don't care. I stopped caring about my privacy. Nobody will hurt me by knowing too much about me. Facebook can have all of my life, because nothing is private for me.

It is a great tool for keeping up with friends; it makes it a lot easier to cultivate friendships than anything before. We can have a lot of friends when we don't keep secrets, because by being open with your weaknesses you create a new friend, not an enemy. You create enemies with secrets and lies.



