@@ -1,8 +1,8 @@
+\documentclass[twocolumn]{article}
+\title{Challenges in bringing low-latency stream anonymity to the masses (DRAFT)}
-
-Challenges in bringing low-latency stream anonymity to the masses
-
+\begin{document}
\section{Introduction}
@@ -27,7 +27,7 @@ useful network to a practical useful anonymous network.
Tor works like this.
-weasel's graph of # nodes and of bandwidth, ideally from week 0.
+weasel's graph of \# nodes and of bandwidth, ideally from week 0.
Tor has the following goals.
@@ -100,7 +100,7 @@ continued money, and they periodically ask what they will do when it
dries up.
Logging. Making logs not revealing. A happy coincidence that verbose
-logging is our #2 performance bottleneck. Is there a way to detect
+logging is our \#2 performance bottleneck. Is there a way to detect
modified servers, or to have them volunteer the information that they're
logging verbosely? Would that actually solve any attacks?
@@ -172,3 +172,5 @@ assuming that, how much anonymity can we get. we're not here to model or
to simulate or to produce equations and formulae. but those have their
roles too.
+\end{document}
+