At one company, the CEO said AI tools in general should not be used, for fear of invalidating a patent application in progress: the lawyer had said the work must be kept secret except with NDA partners. I explained that locally run LLMs don't upload anything, so those are fine. This is a company that really needs better development velocity, and is buried alive in reports that need writing.
On the other hand, at another company, the NDAs are stronger and more one-sided, there's a stronger culture of code silos ("who needs to know" governs read access to individual repos, even for mundane things like web dashboards), and security is higher in general. I expected nobody would be allowed to use these tools there, yet I saw people talking openly about their Copilot and Cursor use on the company Slack.
There was someone sitting next to me using Cursor yesterday. I'd consider hiring them if they're interested, but there's no way they'd want to join a company that forbids AI tools that upload the code being worked on.
So I don't think companies are particularly consistent about this at the moment.
(Perhaps Apple's Private Cloud Compute service, and whatever equivalents we get from other cloud vendors, will eventually make a difference to how companies see this stuff. We might also see some interesting developments with fully homomorphic encryption (FHE). That's very slow today, but the highly symmetric tensor arithmetic used in ML has the potential to work better with FHE than general-purpose compute does.)
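To make the FHE idea concrete, here's a minimal sketch using the TenSEAL library, a Python wrapper around Microsoft SEAL. The parameters are the library's tutorial defaults, not a tuned or production-ready setup. It encrypts two vectors under the CKKS scheme and computes their dot product without ever decrypting the operands, which is the kind of tensor primitive ML inference is built from:

    # Illustrative only: TenSEAL (pip install tenseal) with tutorial
    # parameters, not tuned for any real workload.
    import tenseal as ts

    # Set up a CKKS context (approximate arithmetic over real numbers).
    context = ts.context(
        ts.SCHEME_TYPE.CKKS,
        poly_modulus_degree=8192,
        coeff_mod_bit_sizes=[60, 40, 40, 60],
    )
    context.global_scale = 2**40
    context.generate_galois_keys()  # rotations, needed for dot products

    # Encrypt two vectors; only ciphertexts would leave the client.
    enc_a = ts.ckks_vector(context, [1.0, 2.0, 3.0])
    enc_b = ts.ckks_vector(context, [4.0, 5.0, 6.0])

    # The dot product is computed entirely on encrypted data.
    enc_dot = enc_a.dot(enc_b)

    print(enc_dot.decrypt())  # ~[32.0], up to CKKS approximation error

A server running this never sees the plaintext vectors, which is exactly the property that would matter for code-assist tools; the open question is whether the cost ever gets low enough to be practical.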