Note: This page is historical.

Current pages about Yenta are here. Please look at those pages first.

Yenta is still under active development, but this particular page is not. If you're interested in current research papers about Yenta, or in obtaining a copy of Yenta, please start here instead.

This page is one of many that were written in late 1994 and early 1995, and are being preserved here for historical purposes. If you're viewing this page, you probably found it via an old link or are interested in the history of how Yenta came to be. These pages have not been actively maintained since 1995, so you'll find all sorts of older descriptions which may not match the current system, citations to old papers and old results, and so forth.

Keeping things private: why and how

It is quite common for new information technologies to be deployed without much, if any, consideration of users' privacy concerns, and hence with few or no safeguards built into the architecture of the system. We see these trends every day, whether in (often legally questionable) database matching between disparate organizations' data, correlation of supermarket UPC data with individual charge card numbers, or any of a number of similar cases. (A huge number of these are reported, generally in excruciating and horrifying detail, by Risks Digest and the Privacy Forum; they are also a major focus of the Electronic Frontier Foundation.) Sometimes (generally after some truly egregious disaster of compromised privacy) mechanisms to protect privacy are grafted onto systems already in operation, though this almost never works.

Privacy, of course, means many things to many people. Does it mean security from actively malevolent other users? What about direct-mail advertisers? Or does it mean complete inscrutability to everyone, using strong cryptography? That last option quickly opens a very large political can of worms.

Yenta takes the tack that quite strong user privacy both has explicitly political implications and is required for users to accept agents that might know a great deal about them.

For other viewpoints on this, consult Risks Digest or the Privacy Forum. You might also want to look at some of Bruce Sterling's agitprop, or investigate the discussions of MIT's 6.095/STS095, Ethics and Law on the Electronic Frontier. Yenta's use of strong cryptography also involves quite a lot of politics, which influences its implementation.


Lenny Foner
Last modified: Thu Dec 22 21:09:59 1994