Fair anonymity for the Tor network


Jesus Diaz
Escuela Politecnica Superior
Universidad Autonoma de Madrid
David Arroyo
Escuela Politecnica Superior
Universidad Autonoma de Madrid
Francisco B. Rodriguez
Escuela Politecnica Superior
Universidad Autonoma de Madrid
j.diaz@uam.es david.arroyo@uam.es f.rodriguez@uam.es

Current anonymizing networks have become an important tool for guaranteeing users’ privacy. However, these platforms can be used to perform illegitimate actions, which sometimes makes service providers see traffic coming from these networks as a probable threat. In order to solve this problem, we propose to add support for fairness mechanisms to the Tor network, by introducing a slight modification to the key negotiation process with the entry and exit nodes, based on group signatures. By means of these signatures, we set up an access control method to prevent misbehaving users from making use of the Tor network. Additionally, we establish a predefined method for denouncing illegitimate actions, which prevents the proposed fairness mechanisms from becoming a threat that erodes users’ privacy. As a direct consequence, traffic coming from Tor would be considered less suspicious by service providers.



1 Introduction

Privacy has become a major concern for Internet users. One main approach for enabling privacy is the introduction of anonymizing techniques, and Tor [17] is probably the most popular and widely used anonymizing network.

Tor anonymizes communications by preventing the origin and recipient of a communication from being linked. Moreover, even the recipient cannot learn the IP address of the originator by analyzing the received packets. This is achieved by re-routing the data through several intermediaries, the Onion Routers, adding an extra layer of encryption with each one. Nevertheless, this also reduces the protection available for the addressee, who cannot denounce or even block the originator. This is certainly a factor hindering any wide acceptance of anonymizing networks. Moreover, it causes users acting legitimately to be affected by the illegitimate actions of others. For instance, in some situations legitimate users cannot access a site through Tor, because that site directly bans Tor-originated traffic.

In this work we use group signatures to extend the functionality of Tor’s entry and exit nodes in order to enable the tracing and blocking of misbehaving users. In doing so, we implement an access control mechanism for Tor which does not deteriorate the normal use of the Tor network by users acting legitimately. As a consequence of this fairness mechanism (following [22], by fairness we refer to the capability of taking any measure that could prevent anonymity misuses), service providers would probably increase their trust in Tor, since illegitimate actions coming from Tor would presumably be reduced.

Section 2 summarizes some data evidencing that irrevocable anonymity and unlinkability may be seen as a threat, or something undesirable, in some situations, and in Section 3 we summarize some related work. Section 4 introduces the cryptographic building blocks, and Section 5 describes our approach for incorporating fairness into the Tor network. Section 6 points out some open issues that are important to take into account for realizing our proposal. Finally, Section 7 concludes with a summary of our proposal. Throughout this short paper we assume a basic knowledge of Tor essentials. For a more detailed description of Tor, we refer to [17] or the Tor project website (https://www.torproject.org).

2 Mistrust in complete anonymity

Despite the fact that an obvious use of communications anonymity is to protect users’ privacy, it can also be misused. For instance, in [24] the authors run a Tor exit node to gather statistical evidence for a further analysis of Tor’s traffic. They conclude by stating that it is not uncommon to see “hacking attempts, allegations of copyright infringements, and bot network control channels” routed through Tor. Another type of behavior that may be considered undesirable is the use of the Tor platform for purposes other than the originally intended ones, even though they might not be against any ethical rule. This is reported in [10] as a result of analyzing the traffic going through several Tor exit nodes, concluding that an important share of the packets being routed through Tor corresponds to BitTorrent traffic (roughly 25%). This reflects that, were one willing to limit this use of Tor, little could be done beyond blocking all BitTorrent traffic outright. That would be unfair to users routing BitTorrent traffic through Tor while consuming only a moderate portion of the available bandwidth. Hence, a finer control over this type of “misbehavior” is also desirable. More evidence supporting the claim that the anonymity provided by Tor can be (and is being) misused is the existence of services, like BlockScript (http://www.blockscript.com/features.php), which include explicit functionality for blocking traffic, including data coming from Tor and other anonymizing networks (BlockScript offers a commercial service for blocking “unwanted” traffic and sells the raw blacklist data for $12,000 per year). An interesting discussion about the necessity of accountability in anonymous communication systems is given in [14].
Finally, the risk of failure due to websites blocking Tor has also been considered by the Tor staff in a recent post on the Tor Project blog: “A call to arms: Helping Internet services accept anonymous users” (https://blog.torproject.org/blog/call-arms-helping-internet-services-accept-anonymous-users).

Furthermore, there is an abuse-specific FAQ on the official Tor website (https://www.torproject.org/docs/faq-abuse.html.en) dealing with the subject.

3 Related work

Several systems have been proposed to endow anonymizing platforms with fairness mechanisms. BLAC [29] makes use of a specific group signature scheme in order to allow service providers to manage their own blacklists, following their own judgment when it comes to blocking users. Nymble [30] creates a complex infrastructure in order to provide fairness by revoking the unlinkability of users who misbehave. In PEREA [2], the need for Trusted Third Parties is eliminated at the cost of creating a highly crafted infrastructure in order to allow users to be blocked. EPID [7] makes unlinkability revocation possible using either a member private key, a signature issued by the member to be revoked, or a query to the issuer of the member key. However, it requires the use of Trusted Platform Modules [28], which we consider out of the scope of our proposal.

These systems also take into account that different misbehaviors are possible and, in the case of PEREA, that each misbehavior should be assigned a different severity (the PEREA-Naughtiness variant). Nevertheless, they would probably be too rigid for multi-purpose contexts where different legitimate uses and revocation needs are possible. For instance, Nymble only supports unlinkability revocation during a predefined interval. BLAC is tied to a specific group signature scheme, which might not be appropriate for some situations. PEREA limits the number of authentications that users may perform during a time span, blacklisting them if they exceed that limit. Hence, any change to the implemented functionality that may be necessary for the adoption of these systems into a new context would probably require a great effort.

Our proposal takes advantage of the wide variety of group signature schemes and of the ongoing standardization of anonymous certificates based on them [3, 15, 8, 20]. This allows great flexibility for modifying the desired functionality depending on the context while, at the same time, minimizing deployment costs.

4 Building blocks

We use the notation ⟨O_A; O_B⟩ ← Proc(I)[A(I_A), B(I_B)] to describe a two-party process Proc between parties A and B, where O_A (resp. O_B) is the output to party A (resp. B), I is the common input, and I_A (resp. I_B) is A’s (resp. B’s) private input; when party B does not have output, we sometimes write ⟨O_A; ∅⟩. Single-party processes are denoted by O ← Proc(I), with input I and output O. Also, for readability, we omit the publicly known keys in the process calls. Finally, since we continuously deal with different types of signatures, we use the Greek letter σ to denote a signature produced by any of the schemes described below (the specific type will be clear given the notation in the following subsections).

4.1 Group signatures

Group signatures, first proposed by Chaum and Van Heyst [12], allow a member of a group to issue a signature such that any possible verifier can check that it has been issued by a member of the group, without revealing which specific member issued it. More advanced schemes have been proposed since then [9, 22, 23], improving the scalability and efficiency of group signatures, but also their functionality. Current schemes allow unlinkability revocation and anonymity revocation. Normally, these revocations can only be performed by suitable authorities, but there exist schemes that introduce interesting variants to this behavior, allowing unlinkability revocation only for users who have broken some rule (like exceeding the maximum number of messages to sign [27]).

To summarize, a group in a group signature scheme is basically composed of: a group manager, who owns a secret manager key mk_M and publishes a group key gk; and a set of n members, each in possession of a member key mk_i, for i ∈ {1, …, n}. Overall, the main operations supported by a group signature scheme may be summarized as follows:


GS.Setup: Run by the group manager, creates the group key and the manager key.

GS.Join: Executed jointly between a new user and the group manager, allows new users to join the group, obtaining a member key.

GS.Sign: A user creates a group signature over a message, using her member key.

GS.Verify: Allows anyone with the group key to verify a group signature.

GS.Open: Run by the group manager, allows him to de-anonymize, to some extent, the issuer of a group signature, given the group signature itself, the manager key and, normally, some additional private information. In some schemes, the only possibility is to retrieve her real identity. In others, it is possible to obtain a token which permits linking group signatures made by the same user. For simplicity, in this work we encompass all variants that somehow reduce anonymity under the term GS.Open.

GS.Trace: Allows verifying whether a given group signature has been issued by some given member. In order to run this operation, GS.Open needs to be executed before. Thus, note that there may be different ways to achieve this, according to the variants of the GS.Open functionality.

This, however, should be considered only a “general working mode” of a group signature scheme. For a good review of the main advancements in the field of group signatures, we refer to [23, Section 1.1].
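To make the interface above concrete, here is a minimal Python sketch of these operations. It is emphatically not a secure group signature scheme: it is pseudonymous (all signatures by a member carry the same pseudonym, so they are linkable, unlike in real schemes), and verification uses a secret MAC table instead of a public group key. All names and structure are illustrative assumptions.

```python
import hashlib
import hmac
import secrets

class GroupManager:
    """Toy stand-in for the group manager. Holds the manager key material:
    a pseudonym -> identity map (used by GS.Open) and a pseudonym -> MAC-key
    table playing the role of the group key. Illustration only."""

    def __init__(self):                       # GS.Setup
        self.identities = {}                  # pseudonym -> real identity
        self.verif_table = {}                 # pseudonym -> MAC key
        self.revoked = set()                  # revoked pseudonyms

    def join(self, identity):                 # GS.Join: returns the member key
        pseudonym, mac_key = secrets.token_hex(8), secrets.token_bytes(32)
        self.identities[pseudonym] = identity
        self.verif_table[pseudonym] = mac_key
        return pseudonym, mac_key

    def open(self, signature):                # GS.Open: de-anonymize the issuer
        _tag, pseudonym = signature
        return self.identities[pseudonym]

def gs_sign(member_key, message):             # GS.Sign
    pseudonym, mac_key = member_key
    return hmac.new(mac_key, message, hashlib.sha256).hexdigest(), pseudonym

def gs_verify(group, signature, message):     # GS.Verify (with revocation check)
    tag, pseudonym = signature
    if pseudonym in group.revoked or pseudonym not in group.verif_table:
        return False
    good = hmac.new(group.verif_table[pseudonym], message,
                    hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, good)
```

In this toy, revoking a member amounts to adding her pseudonym to `revoked`, after which her signatures no longer verify; real schemes achieve the analogous effect through (unlinkability) revocation lists without any pseudonym linkage.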

4.2 Blind signatures

Blind signatures were introduced by Chaum in [11]. Basically, a blind signature scheme allows a user to obtain a signature from a signer over an arbitrary message m, without the signer learning anything about m. This idea has since been used for creating diverse privacy-preserving systems. However, since the signer does not learn any information about the message, systems based on blind signatures can easily be abused. To address this issue, fair blind signatures were proposed in [26]. In this variant, a trusted authority holds privileged information allowing it to link message and signature pairs. Restrictive blind signatures [6] allow issuing blind signatures, but only over messages that comply with certain rules. Finally, an also important alternative is given by partially blind signatures [1]. In a partially blind signature, the message is divided in two parts: a common message, to which the signer has complete access; and the blinded message, of which the signer learns nothing. Thus, the common message may be employed to implement misuse prevention mechanisms. As always, several schemes have appeared improving the overall efficiency, reducing the size of the final signatures, or relying on different number-theoretic problems [21, 13, 25, 5].

Although some of the variants of blind signatures would probably be useful for our proposal, we use the general definition of a blind signature for describing it. Thus, hereafter we will use the following notation when referring to the operations of a blind signature scheme:


BS.Setup: Creates the signer’s public and private keys for issuing blind signatures.

BS.Blind: Using some random secret value, the user creates a blinded version β(m) of the message m to be blindly signed, and a proof of correctness π.

BS.Sign: Upon receiving the blinded message, the signer runs any necessary verification and creates a blinded signature using its private key.

BS.Unblind: The user receives the blinded signature and unblinds it, using the secret value generated during the blind process. The result of this operation is the final signature.

BS.Verify: Any entity runs this operation to verify the signature.
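For intuition, Chaum’s original RSA-based construction [11] instantiates these operations almost directly. The sketch below uses textbook RSA with a toy modulus (p = 61, q = 53), so it is illustrative only; operation names mirror the notation above, and the correctness proof π of BS.Blind is omitted.

```python
import hashlib
import secrets
from math import gcd

# Textbook-RSA parameters (p=61, q=53): far too small for real use, chosen
# only so the arithmetic is easy to follow. e*d = 1 (mod phi(N)).
N, E, D = 3233, 17, 2753

def h(msg: bytes) -> int:
    """Hash the message into Z_N (full-domain hash, toy-sized)."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % N

def bs_blind(msg: bytes):                       # BS.Blind (user)
    while True:
        r = secrets.randbelow(N - 2) + 2        # blinding factor, coprime to N
        if gcd(r, N) == 1:
            break
    blinded = (h(msg) * pow(r, E, N)) % N
    return blinded, r                           # r is the user's unblinding secret

def bs_sign(blinded: int) -> int:               # BS.Sign (signer)
    # The signer sees only the blinded value, never h(msg).
    return pow(blinded, D, N)

def bs_unblind(blind_sig: int, r: int) -> int:  # BS.Unblind (user)
    return (blind_sig * pow(r, -1, N)) % N

def bs_verify(sig: int, msg: bytes) -> bool:    # BS.Verify (anyone)
    return pow(sig, E, N) == h(msg)
```

The unblinding works because (h(m) · r^e)^d = h(m)^d · r (mod N), so multiplying by r^(-1) leaves the plain RSA signature h(m)^d.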

4.3 Blind group signatures

A blind group signature scheme is just like a blind signature scheme in which the signer issues a group signature instead of a conventional signature. Therefore, each of the operations described in the previous sections may be independently applied to these schemes. However, for the sake of clarity, when referring to these schemes, we will use the prefix BGS instead of GS or BS.

4.4 Additional cryptographic primitives

Besides the primitives introduced above, we assume readers are familiar with public-key encryption, digital signature and commitment schemes, and zero-knowledge proofs of knowledge. We use com ← Commit(m; r) to denote a commitment com to a message m, where the sender uses uniform random coins r; the sender can open the commitment by sending (m, r) to the receiver. We use π ← ProveZK(x, w) and VerifyZK(x, π) to refer to creating a non-interactive proof π showing that the statement x is in the language (which will be determined by the context) with the witness w, and to verifying the statement x based on the proof π.
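To fix ideas, a commitment under these conventions can be sketched with a simple hash-based construction (hiding and binding in the random-oracle sense; Pedersen commitments would be the usual algebraic choice, and the zero-knowledge proofs are not sketched here):

```python
import hashlib
import secrets

def commit(message: bytes):
    """com <- Commit(m; r): hash-based commitment com = H(m || r).
    Hiding because r is uniform; binding because a second opening would
    require a hash collision. Illustration only."""
    r = secrets.token_bytes(32)
    com = hashlib.sha256(message + r).digest()
    return com, (message, r)        # the sender keeps (m, r) as the opening

def open_commit(com: bytes, opening) -> bool:
    """The receiver checks an opening (m, r) against the commitment."""
    message, r = opening
    return hashlib.sha256(message + r).digest() == com
```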

5 Guidelines for incorporating fairness into Tor

In order to endow Tor with fairness capabilities, the entry and exit nodes take a central role, since they are the only nodes that learn, respectively, the IP address of the user entering the network and that of the final destination. Hence, their knowledge would be necessary to determine whether an IP trying to access the network has already been blocked, or to demonstrate that a given origin IP has accessed a certain destination IP. However, when proposing modifications to those nodes, we must avoid enabling attacks based on establishing a connection between them. For that purpose, we take advantage of both the way the user negotiates keys with the Tor nodes and the properties of group and blind signatures.

Hereafter, we assume that a group has already been set up, and that there is a suitable policy established for fairly managing revocation (see Section 6). Similarly, we assume that the blind signature scheme has also been set up. Table 1 summarizes the notation used throughout the rest of the paper, along with some notation inherited from the description of the Tor network [17] and the notation defined for group and blind signatures and the additional cryptographic primitives in Section 4.

E_k(m): Symmetric encryption of message m with key k.
E_A(m): Asymmetric encryption of m with A’s public key.
{m}_k1,…,kn: Layered encryptions of m following the Tor protocol.
H(m): Application of a cryptographic hash function to m.
g^x: The user’s Diffie-Hellman share.
g^y: The Diffie-Hellman share corresponding to a Tor node.
H(K | "handshake"): A transcript of the handshake for key K.
m1 | m2: m1 concatenated with m2.
σ_EN: Group signature of g^x1, sent to the entry node.
σ_EX: Group signature of g^x2, sent to the exit node.
β(σ_EX): Blinded version of σ_EX.
σ_β: Blindly signed version of β(σ_EX).
σ_BS: Blind signature of σ_EX.
Table 1: Notation summary.

Our approach works by introducing variations in the way a user negotiates the symmetric keys with the entry and exit nodes. In short, we require the user to group-sign the Diffie-Hellman shares sent during the negotiation with the entry and exit nodes. In addition, in order to prevent the user from employing one identity for negotiating with the entry node and a different one with the exit node (see Section 6), the entry node has to blindly sign the group signature that the user will send to the exit node. The resulting modified handshake schemes (see [17, p. 6]) are shown below, where U denotes any arbitrary user, EN the entry node, and EX the exit node. During the handshake with EN, U first group-signs g^x1 and g^x2, sends σ_EN to EN, and also requests EN to blindly sign σ_EX, the group signature of g^x2. If all the operations succeed, EN accepts the connection.

Entry Node Handshake:
U: σ_EN ← GS.Sign(mk_U, g^x1)
U: σ_EX ← GS.Sign(mk_U, g^x2)
U: (β(σ_EX), π) ← BGS.Blind(σ_EX)
U → EN: E_EN(g^x1 | σ_EN | β(σ_EX) | π)
EN: VerifyZK(π)
EN: GS.Verify(σ_EN, g^x1)
EN: σ_β ← BGS.Sign(β(σ_EX))
EN → U: g^y1 | H(K1 | "handshake") | σ_β
U: σ_BS ← BGS.Unblind(σ_β)

When U initiates the handshake with EX, she sends the group signature σ_EX on g^x2 that was blindly signed by EN, along with the blind signature σ_BS itself. If all the verifications succeed, then EX accepts the connection.

Exit Node Handshake:
U → EX: E_EX(g^x2 | σ_EX | σ_BS)
EX: GS.Verify(σ_EX, g^x2)
EX: BGS.Verify(σ_BS, σ_EX)
EX → U: g^y2 | H(K2 | "handshake")

It is important to note that the group signatures are encrypted under the public keys of the entry and exit nodes, respectively. Hence, only the entry and exit nodes learn them. Moreover, the group signature sent to the exit node is blindly signed by the entry node. Thus, even if both nodes collude, they would not be able to determine by themselves that the group signatures they have received were issued by the same user, due to the unlinkability property of the group signature scheme and the blindness property of the blind signature scheme. Moreover, since the group signature sent to the exit node has been blindly signed by the entry node, it is not possible for one user to frame another.
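As a concrete, deliberately simplified illustration of the two handshakes, the sketch below combines a toy pseudonymous stand-in for the group signature scheme with textbook-RSA blind signatures. Everything here is an illustrative assumption: the toy “group signatures” are linkable through their pseudonym (precisely what a real scheme avoids), the RSA modulus is far too small for real use, and the zero-knowledge proof π is omitted.

```python
import hashlib
import hmac
import secrets
from math import gcd

# --- Toy "group signatures": pseudonymous MACs (illustration only) --------
MEMBERS = {}                                  # pseudonym -> MAC key (group manager)

def gs_join(identity):                        # GS.Join (toy)
    pseudonym, key = secrets.token_hex(8), secrets.token_bytes(32)
    MEMBERS[pseudonym] = key
    return pseudonym, key                     # the member key

def gs_sign(member_key, msg):                 # GS.Sign (toy)
    pseudonym, key = member_key
    return pseudonym, hmac.new(key, msg, hashlib.sha256).hexdigest()

def gs_verify(sig, msg):                      # GS.Verify (toy)
    pseudonym, tag = sig
    if pseudonym not in MEMBERS:
        return False
    good = hmac.new(MEMBERS[pseudonym], msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, good)

# --- Textbook-RSA blind signatures for the entry node (tiny, insecure) ----
N, E, D = 3233, 17, 2753                      # p=61, q=53

def h(data):                                  # hash a group signature into Z_N
    return int.from_bytes(hashlib.sha256(repr(data).encode()).digest(), "big") % N

def bgs_blind(sig):                           # BGS.Blind (user)
    while True:
        r = secrets.randbelow(N - 2) + 2
        if gcd(r, N) == 1:
            return (h(sig) * pow(r, E, N)) % N, r

def bgs_unblind(blind_sig, r):                # BGS.Unblind (user)
    return (blind_sig * pow(r, -1, N)) % N

# --- The two modified handshakes (Section 5) ------------------------------
mk = gs_join("some-user")
sig_en = gs_sign(mk, b"g^x1")                 # sigma_EN, for the entry node
sig_ex = gs_sign(mk, b"g^x2")                 # sigma_EX, for the exit node
blinded, r = bgs_blind(sig_ex)                # beta(sigma_EX)

# Entry node: checks sigma_EN, blind-signs the blinded sigma_EX.
assert gs_verify(sig_en, b"g^x1")
sigma_beta = pow(blinded, D, N)               # BGS.Sign

# User unblinds; exit node checks both the group and the blind signature.
sigma_bs = bgs_unblind(sigma_beta, r)
assert gs_verify(sig_ex, b"g^x2")             # exit node, GS.Verify
assert pow(sigma_bs, E, N) == h(sig_ex)       # exit node, BGS.Verify
```

Note that the entry node only ever sees `blinded`, never `sig_ex` itself, which is what keeps the two handshakes unlinkable in the real construction.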

The modified key negotiation with the entry node is depicted in Fig. 1, and the one corresponding to the exit node is depicted in Fig. 2.





Figure 1: The user sends to the entry node a group signature of her share of the key, encrypted with the node’s public key, and a blinded version of the group signature to be sent to the exit node. The entry node returns a blindly signed version of the latter.












Figure 2: The user sends to the exit node a group signature of her share of the key, encrypted with the node’s public key and the blind signature issued by the entry node.

5.1 How to block misbehaving users

Let us assume that some user U has been revoked due to some illegitimate behavior. When U tries to establish a circuit, she will need to perform a handshake with the chosen Tor entry node. Hence, upon receiving the first message with the group signature, the entry node will verify the received group signature, checking whether or not the member who issued it has been revoked. Given that the member key of U has been revoked, the verification will fail, and the entry node will reject the connection. Note that if the user has not been revoked, the privacy guarantees provided by Tor are not diminished.

5.2 How to denounce misbehaving users

In this case, we assume that U has already established a circuit and is communicating with some server S (external to Tor). Also, let us suppose that, eventually, U performs some illegitimate action. When that happens, S denounces this behavior following some predefined method. If deemed appropriate, the group signature received by the exit node during the handshake may be used to retrieve U’s identity, or to trace her. Specifically, the exit node provides the following information:

  • E_k(m), where m is the message received and denounced by S, and k is the symmetric key negotiated between U and the exit node.

  • g^x2 and g^y2, where g^x2 is U’s share of the handshake and g^y2 is the share created by the exit node.

  • σ_EX, i.e., a group signature of g^x2 issued by U.

In order to verify that the received denounce is valid, it is necessary to check that the denounced message m corresponds to the encryption E_k(m) provided by the exit node. Also, σ_EX must be a valid group signature over g^x2. Finally, the exit node may be required to prove that it knows the discrete logarithm of g^y2 to the base g. If these checks succeed, then the member with key mk_U, i.e., U, is responsible for m. Hence, U’s key can be consequently revoked, and the circuit may be closed by the exit node. Note that subsequent attempts by U to establish a circuit would be blocked by the entry node, since the member key of U has been revoked.
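The checks above can be sketched as follows. The SHA-256 counter-mode keystream standing in for the symmetric encryption E_k, and the `gs_verify` callback, are illustrative assumptions (the exit node’s zero-knowledge proof of its Diffie-Hellman exponent is omitted):

```python
import hashlib

def stream_xor(key: bytes, data: bytes) -> bytes:
    """Toy SHA-256 counter-mode keystream standing in for the symmetric
    encryption E_k used on the circuit (illustration only). XOR makes
    encryption and decryption the same operation."""
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

def check_denounce(denounced_msg, ciphertext, k, g_x2, sig_ex, gs_verify):
    """Checks run on a denounce: (1) the denounced message matches the
    ciphertext the exit node logged; (2) sig_ex is a valid group signature
    over the user's DH share g_x2. gs_verify is whatever verification
    function the deployed group signature scheme provides."""
    if stream_xor(k, ciphertext) != denounced_msg:
        return False
    return gs_verify(sig_ex, g_x2)
```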

5.3 Additional remarks

A few remarks are worth making concerning “special” situations and subjects that should be taken into account.

Tor bridges

Our proposal is directly extensible to support Tor bridges, by considering them as entry nodes to the Tor network.

Leaky pipe topology

This topology allows a client to output some packets through a node other than the negotiated exit node. In order to adapt our approach to this case, the node leaking the packet should follow, ad hoc, the procedure defined for exit nodes.


Logging

Since hosts being accessed through this fair Tor extension might want to issue denounces against traffic originating from Tor, logging the information necessary for solving disputes is required. Hence, the exit nodes need to keep the information specified in Section 5.2 (this also applies to the leaky pipe extension). An appropriate logging policy should be established.

Denunciation time span

Considering that our proposal requires Tor exit nodes (and circumstantially other nodes) to log several pieces of information, it would pose a serious scalability problem if this logging were expected to last too long. Hence, it seems appropriate to establish a predetermined time span for accepting denounces. Upon expiration of that time span, all the logged information could be removed, and any subsequent denounce related to that information rejected. This would require potential complainants to be aware of this time limitation.
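A minimal sketch of such a bounded log follows; the field names, the circuit-id key, and the seconds-based TTL are illustrative choices, not part of the proposal:

```python
import time

class DenounceLog:
    """Exit-node log with a denunciation time span: entries older than
    `ttl_seconds` are pruned, and denounces against them rejected."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self.entries = {}                     # circuit id -> (timestamp, record)

    def record(self, circuit_id, record, now=None):
        """Store the Section 5.2 material (E_k(m), DH shares, sigma_EX)."""
        self.entries[circuit_id] = (now if now is not None else time.time(),
                                    record)

    def prune(self, now=None):
        """Drop every entry whose time span has expired."""
        now = now if now is not None else time.time()
        self.entries = {c: (t, r) for c, (t, r) in self.entries.items()
                        if now - t <= self.ttl}

    def fetch_for_denounce(self, circuit_id, now=None):
        """Return the logged record if the denounce is still in time."""
        self.prune(now)
        entry = self.entries.get(circuit_id)
        return entry[1] if entry else None
```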

6 Open issues

In the scheme given in Section 5, we just use the general definitions of the building blocks to define our system. The analysis of which specific variants should be employed is left as future work. Note that this is a very delicate decision, since different options offer different privacy properties. Moreover, we may even need different schemes depending on who issues the signatures (e.g., group signatures are issued both by users and entry nodes in our proposal). Thus, given the sensitivity of the information managed by Tor, this is an issue that needs to be studied in depth by itself. For that matter, the extensible group signatures library libgroupsig [16] may offer interesting features. In addition, concerning the blind signatures, it would probably be necessary to use some of their variants to prevent circumventing the controls explained above. Namely, with the previous bare scheme, a user could reuse the same blind signature indefinitely. This may be solved simply by using partially blind signatures, having the entry node introduce a lifetime value for the blind signature as the common message.

Other important issues are determining when misbehaving users should be revoked, and by whom. The former would probably depend on the website (or service) being accessed through Tor. For the latter, a good solution given Tor’s infrastructure would be to apply threshold schemes to the revocation procedures (see [4]), such that a majority of the authorities participating in the network consensus need to agree in order to revoke a user.

Finally, note that Sybil attacks [18] are partly addressed by forcing users to use the same member key for the group signature sent to the entry node and for the group signature sent to the exit node (and having the latter blindly signed by the entry node). However, some additional mechanism should be included to prevent users from arbitrarily generating new member keys. Since asking users to register may not be well received (it may be seen as a threat to anonymity), requesting them to perform some proof of work [19] during the generation of the member keys may be a good alternative.
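The proof-of-work alternative can be sketched hashcash-style [19]: before GS.Join completes, the manager sends a fresh challenge, and the prospective member must find a nonce meeting a difficulty target. The parameters and encoding below are illustrative assumptions:

```python
import hashlib
import itertools

def pow_solve(challenge: bytes, difficulty_bits: int) -> int:
    """Find a nonce such that SHA-256(challenge || nonce) starts with
    `difficulty_bits` zero bits. Expected cost: ~2^difficulty_bits hashes,
    which is what rate-limits member-key generation."""
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def pow_verify(challenge: bytes, nonce: int, difficulty_bits: int) -> bool:
    """Verification is a single hash, so it is cheap for the group manager."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))
```

The asymmetry (expensive to solve, trivial to verify) is what makes this a plausible Sybil deterrent; the difficulty parameter would have to be tuned so that honest users are not unduly delayed.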

7 Conclusion

In this work we have proposed an extension to the Tor network in order to endow it with functionality for preventing misbehaving users from accessing the network. We expect such functionality to increase the trust of websites in Tor and thus discourage them from blocking users coming from it. This extension follows the design of Tor, and does not require any modification to its infrastructure. It works by including two group signatures in the key negotiation processes with the entry and exit nodes (one for each), and having the entry node blindly sign the group signature to be sent to the exit node. The group signature sent to the exit node allows service providers to denounce illegitimate actions. Once the unlinkability of a user has been revoked as a consequence of some illegitimate behavior, any entry node would be able to block that specific user just by checking whether she is included in an (unlinkability) Revocation List.


  • [1] M. Abe and E. Fujisaki. How to date blind signatures. In ASIACRYPT, pages 244–251, 1996.
  • [2] M. H. Au, P. P. Tsang, and A. Kapadia. PEREA: Practical TTP-free revocation of repeatedly misbehaving anonymous users. ACM Trans. Inf. Syst. Secur., 14(4):29, 2011.
  • [3] V. Benjumea, S. G. Choi, J. Lopez, and M. Yung. Anonymity 2.0 - X.509 extensions supporting privacy-friendly authentication. In CANS, pages 265–281, 2007.
  • [4] V. Benjumea, S. G. Choi, J. Lopez, and M. Yung. Fair traceable multi-group signatures. In Financial Cryptography, pages 231–246, 2008.
  • [5] O. Blazy, G. Fuchsbauer, D. Pointcheval, and D. Vergnaud. Short blind signatures. Journal of Computer Security, 21(5):627–661, 2013.
  • [6] S. Brands. Untraceable off-line cash in wallets with observers (extended abstract). In CRYPTO, pages 302–318, 1993.
  • [7] E. Brickell and J. Li. Enhanced Privacy ID: A direct anonymous attestation scheme with enhanced revocation capabilities. Dependable and Secure Computing, IEEE Transactions on, 9(3):345–360, 2012.
  • [8] J. Camenisch, I. Krontiris, A. Lehmann, G. Neven, C. Paquin, K. Rannenberg, and H. Zwingelberg. D2.1 architecture for attribute-based credential technologies - version 1, 2011.
  • [9] J. Camenisch and A. Lysyanskaya. Dynamic accumulators and application to efficient revocation of anonymous credentials. In CRYPTO, pages 61–76, 2002.
  • [10] A. Chaabane, P. Manils, and M. A. Kâafar. Digging into anonymous traffic: A deep analysis of the tor anonymizing network. In NSS, pages 167–174, 2010.
  • [11] D. Chaum. Blind signatures for untraceable payments. In CRYPTO, pages 199–203, 1982.
  • [12] D. Chaum and E. van Heyst. Group signatures. In EUROCRYPT, pages 257–265, 1991.
  • [13] X. Chen, F. Zhang, Y. Mu, and W. Susilo. Efficient provably secure restrictive partially blind signatures from bilinear pairings. In Financial Cryptography, pages 251–265, 2006.
  • [14] C. Diaz and B. Preneel. Accountable anonymous communication. In Security, Privacy, and Trust in Modern Data Management, Data-Centric Systems and Applications, pages 239–253. Springer Berlin Heidelberg, 2007.
  • [15] J. Diaz, D. Arroyo, and F. B. Rodriguez. New X.509-based mechanisms for fair anonymity management. Computers & Security, 46:111–125, 2014.
  • [16] J. Diaz, D. Arroyo, and F. B. Rodriguez. libgroupsig: An extensible C library for group signatures. Submitted, 2015.
  • [17] R. Dingledine, N. Mathewson, and P. F. Syverson. Tor: The second-generation onion router. In USENIX Security Symposium, pages 303–320, 2004.
  • [18] J. R. Douceur. The sybil attack. In Peer-to-Peer Systems, First International Workshop, IPTPS 2002, Cambridge, MA, USA, March 7-8, 2002, Revised Papers, pages 251–260, 2002.
  • [19] C. Dwork and M. Naor. Pricing via processing or combatting junk mail. In Advances in Cryptology - CRYPTO ’92, 12th Annual International Cryptology Conference, Santa Barbara, California, USA, August 16-20, 1992, Proceedings, pages 139–147, 1992.
  • [20] ISO/IEC. ISO/IEC CD 20008-1.2: Information technology - security techniques - anonymous digital signatures - part 1: General, 2012.
  • [21] W.-S. Juang and C.-L. Lei. Partially blind threshold signatures based on discrete logarithm. Computer Communications, 22(1):73–86, 1999.
  • [22] A. Kiayias, Y. Tsiounis, and M. Yung. Traceable signatures. In EUROCRYPT, pages 571–589, 2004.
  • [23] B. Libert, T. Peters, and M. Yung. Group signatures with almost-for-free revocation. In CRYPTO, pages 571–589, 2012.
  • [24] D. McCoy, K. S. Bauer, D. Grunwald, T. Kohno, and D. C. Sicker. Shining light in dark places: Understanding the Tor network. In Privacy Enhancing Technologies, pages 63–76, 2008.
  • [25] T. Okamoto. Efficient blind and partially blind signatures without random oracles. In TCC, pages 80–99, 2006.
  • [26] M. Stadler, J.-M. Piveteau, and J. Camenisch. Fair blind signatures. In EUROCRYPT, pages 209–219, 1995.
  • [27] I. Teranishi, J. Furukawa, and K. Sako. k-times anonymous authentication (extended abstract). In ASIACRYPT, pages 308–322, 2004.
  • [28] Trusted Computing Group. TCG TPM Specification 1.2. http://www.trustedcomputinggroup.org, 2003.
  • [29] P. P. Tsang, M. H. Au, A. Kapadia, and S. W. Smith. Blacklistable anonymous credentials: blocking misbehaving users without ttps. In ACM Conference on Computer and Communications Security, pages 72–81, 2007.
  • [30] P. P. Tsang, A. Kapadia, C. Cornelius, and S. W. Smith. Nymble: Blocking misbehaving users in anonymizing networks. IEEE Trans. Dependable Sec. Comput., 8(2):256–269, 2011.