The Internet is a network, or more accurately: it is a network of connected networks.
The World Wide Web is a cyberspace that exists upon the Internet… or “via”(?) the Internet. – (I don’t think our language, our human English, has words to accurately describe how networks and cyberspaces are related; for example, your brain is a neural network, or, more accurately, it is a network of networks; your consciousness (your thoughts, your memories, your emotions… your “content”(?)) exists “upon”(?) your brain. You see?: our language doesn’t really describe this.)
The Deep Web is content of the World Wide Web that is not accessible through standard search engines.
A darknet is a network that is built on top of, or overlaid upon, the Internet, and to which access is restricted, requiring, for example, special software or authorization.
The Dark Web is content that exists on darknets.
A darknet market is a marketplace that exists on a darknet.
The Internet and darknets are networks; darknets overlay the Internet.
Considering the World Wide Web as content on the Internet, the World Wide Web can be divided into two subsets: the Surface Web and the Deep Web. The Surface Web is content that is accessible through standard search engines. The Deep Web is content that is not accessible through standard search engines, and it is much, much larger than the Surface Web, but it is mainly composed of information in databases (databases are generally not searchable through standard search engines). However, some content of the Deep Web is not accessible by standard search engines because it exists upon darknets, and that content is called the Dark Web.
The World Wide Web is the superset; the Deep Web is a subset of the World Wide Web; the Dark Web is a subset of the Deep Web; and darknet markets form a subset of the Dark Web.
⁓The Voice before the Void
“Deep web (search)”
This article is about the part of the World Wide Web not indexed by traditional search engines. For the part of the World Wide Web which exists on Darknets, see Dark Web.
The Deep Web, Deep Net, Invisible Web, or Hidden Web are search terms referring to the content on the World Wide Web that is not indexed by standard search engines. Computer scientist Michael K. Bergman is credited with coining the term in 2000.
1. Terminology conflation
The first conflation of the terms came about in 2009 when Deep Web search terminology was discussed alongside illegal activities taking place on the darknet Freenet.
In subsequent media reporting about the darknet market Silk Road, many commentators and media outlets have taken to using the term “Deep Web” synonymously with the terms “Dark Web” and “Darknet,” a conflation that BrightPlanet Corporation rejects as inaccurate and that remains an ongoing source of confusion. Wired reporters Kim Zetter and Andy Greenberg recommend the terms be used in distinct fashions.
2. Size
In 2000, Michael K. Bergman observed that searching on the Internet can be compared to dragging a net across the surface of the ocean: a great deal may be caught in the net, but there is a wealth of information that is deep and therefore missed. Most of the web’s information is buried far down on sites, and standard search engines do not find it. Traditional search engines cannot see or retrieve content in the Deep Web. The portion of the web that is indexed by standard search engines is known as the Surface Web. As of 2001, the Deep Web was several orders of magnitude larger than the Surface Web. An analogy of an iceberg has been used to represent the division between Surface Web and Deep Web.
The size of the deep web is impossible to measure and difficult even to estimate, because the majority of its information is hidden or locked inside databases. Early estimates suggested that the deep web is 400 to 550 times larger than the surface web. However, since more information and sites are continually being added, it can be assumed that the deep web is growing exponentially at a rate that cannot be quantified.
Estimates based on extrapolations from a study done at University of California, Berkeley in 2001 speculate that the deep web consists of about 7.5 petabytes. More accurate estimates are available for the number of resources in the deep web: research of He et al. detected around 300,000 deep web sites in the entire web in 2004, and, according to Denis Shestakov, around 14,000 deep web sites existed in the Russian part of the Web in 2006.
3. Non-indexed content
Bergman, in a seminal paper on the Deep Web published in 2001 in The Journal of Electronic Publishing, mentioned that Jill H. Ellsworth used the term Invisible Web in 1994 to refer to websites that were not registered with any search engine. Bergman cited a January 1996 article by Frank Garcia:
It would be a site that’s possibly reasonably designed, but they didn’t bother to register it with any of the search engines. So, no one can find them! You’re hidden. I call that the invisible Web.
Another early use of the term “Invisible Web” was by Matthew B. Koll of Personal Library Software in a December 1996 press release.
The first use of the specific term “Deep Web,” now generally accepted, occurred in the aforementioned Bergman study.
4. Content types
Methods which prevent web pages from being indexed by traditional search engines may be categorized as one or more of the following:
* Dynamic content: dynamic pages which are returned in response to a submitted query or accessed only through a form, especially if open-domain input elements (such as text fields) are used; such fields are hard to navigate without domain knowledge.
* Unlinked content: pages which are not linked to by other pages, which may prevent web crawling programs from accessing the content. This content is referred to as pages without backlinks (also known as inlinks). Also, search engines do not always detect all backlinks from searched web pages.
* Private Web: sites that require registration and login (password-protected resources).
* Contextual Web: pages with content varying for different access contexts (for example: ranges of client IP addresses or previous navigation sequence).
* Limited access content: sites that limit access to their pages in a technical way (for example: using the Robots Exclusion Standard or CAPTCHAs, or no-store directives which prohibit search engines from browsing them and creating cached copies).
* Non-HTML/text content: textual content encoded in multimedia (image or video) files or specific file formats not handled by search engines.
* Software: certain content is intentionally hidden from the regular Internet, accessible only with special software, such as Tor, I2P, or other darknet software. For example, Tor allows users to access websites using the .onion host suffix anonymously, hiding their IP address.
* Web archives: web archival services such as the Wayback Machine enable users to see archived versions of web pages across time, including websites which have become inaccessible, and are not indexed by search engines such as Google.
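The Robots Exclusion Standard mentioned in the list above is a plain-text convention: a site publishes a robots.txt file, and compliant crawlers consult it before fetching pages. A minimal sketch of how a crawler applies such rules, using only Python’s standard library; the robots.txt contents and URLs are invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt excluding one directory from all crawlers.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant crawler checks permission before fetching each URL.
print(parser.can_fetch("*", "https://example.com/private/report"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))      # True
```

Pages disallowed this way remain reachable by any browser, but a well-behaved search engine will never index them, so they fall into the Deep Web.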
5. Indexing methodologies
While it is not always possible to discover a specific web server’s content directly so that it may be indexed, a site can potentially be accessed indirectly (due to computer vulnerabilities).
To discover content on the web, search engines use web crawlers that follow hyperlinks through known protocol virtual port numbers. This technique is ideal for discovering content on the Surface Web but is often ineffective at finding Deep Web content. For example, these crawlers do not attempt to find dynamic pages that are the result of database queries, due to the indeterminate number of queries that are possible. It has been noted that this can be (partially) overcome by providing links to query results, but this could unintentionally inflate the popularity of a member of the Deep Web.
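The core crawling step described above — extracting and resolving hyperlinks from a fetched page — can be sketched with Python’s standard library. The HTML snippet and URLs here are invented for illustration; a real crawler would also fetch pages over the network, deduplicate URLs, and respect robots.txt:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect absolute URLs from <a href="..."> tags on one page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical page content:
html = '<a href="/page1">One</a> <a href="https://other.example/p2">Two</a>'
ex = LinkExtractor("https://example.com/")
ex.feed(html)
print(ex.links)
# ['https://example.com/page1', 'https://other.example/p2']
```

A crawler that only follows links like these never sees a page generated solely in response to a form submission, which is exactly why form-backed databases stay in the Deep Web.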
DeepPeep, Intute, Deep Web Technologies, Scirus, and Ahmia are a few search engines that have accessed the Deep Web. Intute ran out of funding and is now a temporary static archive as of July 2011. Scirus retired near the end of January 2013.
Researchers have been exploring how the Deep Web can be crawled in an automatic fashion, including content that can be accessed only by special software such as Tor. In 2001, Sriram Raghavan and Hector Garcia-Molina at Stanford University presented an architectural model for a hidden-web crawler that used key terms provided by users or collected from the query interfaces to query a web form and crawl the Deep Web content. Alexandros Ntoulas, Petros Zerfos, and Junghoo Cho of UCLA created a hidden-web crawler that automatically generated meaningful queries to issue against search forms. Several form query languages (for example: DEQUEL) have been proposed that, besides issuing a query, also allow extraction of structured data from result pages. Another effort is DeepPeep, a project of the University of Utah sponsored by the National Science Foundation, which gathered hidden-web sources (web forms) in different domains based on novel focused crawler techniques.
Commercial search engines have begun exploring alternative methods to crawl the Deep Web. The Sitemap Protocol (first introduced by Google in 2005) and mod_oai are mechanisms that allow search engines and other interested parties to discover Deep Web resources on particular web servers.
In 2008, to make Tor hidden services easier to access and search, Aaron Swartz designed Tor2web—a proxy application able to provide access by means of common web browsers. Using this application, Deep Web links appear as a random string of letters followed by the .onion top-level domain.
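A Tor2web-style gateway works by rewriting hidden-service URLs: content at an address like <addr>.onion is served by the gateway at <addr>.<gateway-domain>, so an ordinary browser can reach it without running Tor (at the cost of anonymity). A minimal sketch of that rewriting, assuming the tor2web.org gateway domain the original project used; the onion address below is only an example:

```python
from urllib.parse import urlparse, urlunparse

def to_gateway_url(onion_url, gateway="tor2web.org"):
    """Rewrite a .onion URL into a Tor2web-style gateway URL.

    Toy sketch: a gateway serving <addr>.onion at <addr>.<gateway>,
    reachable from ordinary browsers over HTTPS.
    """
    parts = urlparse(onion_url)
    if not parts.netloc.endswith(".onion"):
        raise ValueError("not an onion address")
    new_host = parts.netloc[:-len(".onion")] + "." + gateway
    return urlunparse(parts._replace(scheme="https", netloc=new_host))

# Example 16-character onion address:
print(to_gateway_url("http://duskgytldkxiuqc6.onion/page"))
# https://duskgytldkxiuqc6.tor2web.org/page
```

The random-looking string of letters is the hidden service’s address itself; the gateway merely relays traffic to it over Tor on the user’s behalf.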
This article is about networking technology. For websites that exist on top of this technology, see Dark Web.
A darknet (or dark net) is an overlay network that can only be accessed with specific software, configurations, or authorization, often using non-standard communications protocols and ports. Two typical darknet types are friend-to-friend networks (usually used for file sharing with a peer-to-peer connection) and anonymity networks such as Tor via an anonymized series of connections.
The reciprocal term for an encrypted darknet is the clearnet or, when referring to content indexable by search engines, the Surface Web.
Originally coined in the 1970s to designate networks which were isolated from ARPANET (which evolved into the Internet) for security purposes, darknets were able to receive data from ARPANET but had addresses which did not appear in the network lists and would not answer pings or other inquiries.
The term gained public acceptance following publication of “The Darknet and the Future of Content Distribution,” a 2002 paper by four employees of Microsoft who argued that the presence of the darknet was the primary hindrance to the development of workable digital rights management (DRM) technologies and therefore the primary contributor to the inevitability of copyright infringement.
Journalist J.D. Lasica, in his 2005 book “Darknet: Hollywood’s War Against the Digital Generation,” describes the darknet’s reach as encompassing file sharing networks.
As of 2015, “The Darknet” is often used interchangeably with “The Dark Web” due to the quantity of hidden services on Tor’s darknet. In addition, the term is often confusingly used interchangeably with “the Deep Web” due to how Tor historically could not be search indexed. Mixing the use of these terms has been described as inaccurate, with some commentators recommending the terms be used in distinct fashions.
Darknets in general may be used for various reasons, such as:
* To better protect privacy from mass surveillance and targeted surveillance
* To protect dissidents from political reprisal
* For whistleblowing and news leaks
* For computer crime
* To exchange restricted goods on darknet markets
* For file sharing
Darknets require specific software to be installed or network configurations to be made in order to access them. Tor, for example, can be accessed via a customised browser from Vidalia (a.k.a. the Tor Browser Bundle), or alternatively via a proxy server configured to perform the same function.
This article is about darknet websites. For the part of the Internet not accessible by traditional search engines, see Deep web (search).
The Dark Web, also confusingly referred to as the Deep Web and conflated with Deep Web search, is the World Wide Web content that exists on darknets, networks which overlay the public Internet and require specific software, configurations, or authorization to access. The Dark Web forms part of the Deep Web, the part of the Web not indexed by search engines. The darknets which constitute the Dark Web include small, friend-to-friend peer-to-peer networks, as well as large, popular networks like Freenet, I2P, and Tor, operated by public organizations and individuals. Users of the Dark Web refer to the regular web as the Clearnet due to its unencrypted nature. The Tor dark web may be referred to as Onionland, a reference to the network’s name as the “onion router.”
A December 2014 study by Gareth Owen from the University of Portsmouth found that the most commonly requested type of content on Tor was child pornography, followed by black markets, while the individual sites with the highest traffic were dedicated to botnet operations. Many whistleblowing sites maintain a presence, as well as political discussion forums. Cloned websites and other scam sites are numerous. Many hackers sell their services there individually or as a part of groups. Sites associated with Bitcoin, fraud related services, and mail order services are some of the most prolific.
1.1 Darknet markets
Commercial darknet markets, which mediate transactions for illegal drugs and other goods, attracted significant media coverage starting with the popularity of Silk Road and its subsequent seizure by legal authorities. Other markets sell software exploits and weapons.
There are reports of crowdfunded assassinations and hitmen for hire.
1.3 Snuff films
There is an urban legend that live murder can be found on the Dark Web. The term “Red Room” has been coined based on the Japanese animation of the same name. However, the evidence points towards any reported instances being hoaxes.
1.4 Child pornography
There is regular law enforcement action against sites distributing child pornography – often via compromising the site by distributing malware to the users. Sites use complex systems of guides, forums, and community regulation.
Although much of the Dark Web is innocuous, some prosecutors and government agencies, among others, are concerned that it is a haven for criminal activity. In his 2014 book “The Dark Net,” journalist Jamie Bartlett uses the terms “the dark net” and “dark web” to describe a range of underground and emergent subcultures, including social media racists, cam girls, self-harm communities, darknet drug markets, cryptoanarchists, and transhumanists.
Specialist news sites such as DeepDotWeb and All Things Vice provide news coverage and practical information about Dark Web sites and services. The Hidden Wiki and its mirrors and forks hold some of the largest directories of content at any given time.
Popular sources of Dark Web .onion links include Pastebin, YouTube, Twitter, Reddit, and other Internet forums.
In August 2015, it was announced that Interpol now offers a dedicated Dark Web training program featuring technical information on Tor, cybersecurity, and simulated darknet market takedowns.
A darknet market or cryptomarket is a commercial website on the Dark Web, operating on top of darknets such as Tor or I2P. Most function as black markets, selling or brokering transactions involving drugs, cyber-arms, weapons, counterfeit currency, stolen credit card details, forged documents, unlicensed pharmaceuticals, and other illicit goods, as well as legal products.
Contemporary markets are characterised by their use of darknet (typically Tor) anonymised access, Bitcoin payment and escrow services, and eBay-like vendor feedback systems.
1.1 1970s to 2011
Though e-commerce on the Dark Web only started around 2006, illicit goods were among the first items to be transacted using the Internet, when in the early 1970s students at Stanford University and the Massachusetts Institute of Technology used what was then called the ARPANET to coordinate the purchase of cannabis. By the end of the 1980s, newsgroups like alt.drugs became centers of drug discussion and information; however, any deals were arranged entirely off-site directly between individuals. With the development and popularization of the World Wide Web and e-commerce in the 1990s, the tools to discuss or conduct illicit transactions became more widely available. One of the better-known web-based drug markets, The Hive, launched in 1997 to serve as an information sharing forum for practical drug synthesis and legal discussion. The Hive was featured in a Dateline NBC special called “The ‘X’ Files” in 2001, bringing the subject into public discourse.
Since 2000, parts of the emerging cyber-arms industry have operated online, including the Eastern European “Cyber-arms Bazaar,” trafficking in the most powerful crimeware and hacking tools. In the 2000s, early cybercrime and carding forums such as ShadowCrew experimented with drug wholesaling on a limited scale.
The Farmer’s Market was launched in 2006 and moved onto Tor in 2010. It has been considered a “proto-Silk Road,” but the use of payment services such as PayPal and Western Union allowed law enforcement to trace payments and it was subsequently shut down and several operators and users arrested in April 2012 as a result of Operation Adam Bomb, a two-year investigation led by the U.S. Drug Enforcement Administration (DEA).
1.2 Silk Road and early markets
The first marketplace to use both Tor and Bitcoin escrow was Silk Road, founded by Ross Ulbricht under the pseudonym “Dread Pirate Roberts” in February 2011. In June 2011, Gawker published an article about the site, which led to “Internet buzz” and an increase in website traffic. This in turn led to political pressure from United States Senator Chuck Schumer on the U.S. DEA and Department of Justice to shut down Silk Road, which they finally did in October 2013, following a lengthy investigation. Silk Road’s use of Tor, Bitcoin escrow, and feedback systems would set the standard for new darknet markets for the coming years. The shutdown of Silk Road was described by the news website DeepDotWeb as “the best advertising the dark net markets could have hoped for” following the proliferation of competing sites that the shutdown caused. The Guardian predicted that other sites would take over the market that Silk Road had previously dominated.
The months and years following Silk Road’s closure would be marked by a greatly increased number of shorter-lived markets as well as semi-regular law enforcement takedowns, hacks, scams, and voluntary closures.
Atlantis, the first site to accept Litecoin as well as Bitcoin, closed in September 2013, just prior to the Silk Road raid, allowing users just one week to withdraw any coins. In October 2013, Project Black Flag closed and stole their users’ bitcoins in the panic following Silk Road’s shutdown. Black Market Reloaded’s popularity increased dramatically after the closure of Silk Road; however, in late November 2013, the owner of Black Market Reloaded announced that the website would be taken offline due to the unmanageable influx of new customers. Sheep Marketplace, which launched in March 2013, was one of the lesser-known sites to gain popularity with Silk Road’s closure; it ceased operation in December 2013, when it announced it was shutting down after a vendor stole $6 million worth of users’ bitcoins.
1.3 Post-Silk Road to present
From late 2013 through 2014, new markets started launching with regularity, such as Silk Road 2.0, run by the former Silk Road site administrators, and the Agora marketplace. Such launches were not always successful; in February 2014, the highly anticipated market Utopia opened only to shut down eight days later following rapid actions by Dutch law enforcement.
November 2014 briefly shook the darknet market ecosystem when Operation Onymous, executed by the FBI and the U.K.’s National Crime Agency, led to the seizure of 27 hidden sites, including Silk Road 2.0 as well as smaller markets and individual vendor sites. Agora escaped Operation Onymous and as of April 2015 had gone on to become the largest overall marketplace with more listings than Silk Road at its height.
In March 2015, the Evolution marketplace performed an “exit scam,” stealing escrowed bitcoins worth $12 million, half of the ecosystem’s listing market share at that time. The closure of Evolution led to users redistributing to Black Bank and Agora. Black Bank, which in April 2015 captured 5% of darknet market listings, announced on May 18, 2015 its closure for “maintenance” before disappearing in a similar scam. Following these events, commentators suggested that decentralized marketplaces could be required, such as the service OpenBazaar, in order to protect buyers and vendors from risk, as well as more widespread support from “multi-sig” cryptocurrency payments.
In April 2015, TheRealDeal, the first open cyber-arms market for software exploits as well as drugs, launched to the interest of computer security experts. In May 2015, various distributed denial-of-service (DDoS) attacks were performed against different markets including TheRealDeal. The market owners set up a phishing website to get the attacker’s password, and subsequently revealed collaboration between the attacker and the administrator of Mr Nice Guy’s market, who was also planning to scam his users; this information was revealed to the news website DeepDotWeb.
On July 31, 2015 the Italian police, in conjunction with Europol, shut down the Italian-language darknet market Babylon, seizing 11,000 Bitcoin wallet addresses and 1 million euros.
2. Market features
2.1 Search and discussion
One of the central discussion forums is Reddit’s /r/DarkNetMarkets/, which has been the subject of legal investigation, as well as the Tor-based discussion forum, The Hub. Many marketplaces maintain their own dedicated discussion forums and subreddits.
The dedicated market search engine Grams allows the searching of multiple markets directly without login or registration.
Dark Web news and review sites such as DeepDotWeb and All Things Vice provide exclusive interviews and commentary on the dynamic markets. Uptime and comparison services such as DNStats provide sources of information about active markets as well as suspected scams and law enforcement activity. Due to the decentralized nature of these markets, phishing and scam sites are often maliciously or accidentally referenced.
After discovering the location of a market, a user must register on the site, sometimes with a referral link, after which they can browse listings. A further PIN may be required to perform transactions, better protecting users against login credential compromise.
2.2 Payments and infrastructure
Transactions typically use Bitcoin for payment, sometimes combined with cryptocurrency tumblers for added anonymity and Pretty Good Privacy (PGP) data encryption to secure communications between buyers and vendors from being stored on the site itself. Many sites use Bitcoin multisig transactions to improve security and reduce dependency on the site’s escrow. The Helix bitcoin tumbler offers direct anonymized marketplace payment integrations.
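The multisig arrangement mentioned above means escrowed funds cannot move on the say-so of any single party: release typically requires signatures from some quorum, commonly two of the three parties (buyer, vendor, market). The following is a toy model of that 2-of-3 release rule only — real multisig is enforced by Bitcoin script, not application code, and the party names here are illustrative:

```python
def can_release(approvals, required=2, parties=("buyer", "vendor", "market")):
    """Toy 2-of-3 multisig rule: funds are released only when at
    least `required` distinct recognised parties have approved."""
    valid = set(approvals) & set(parties)
    return len(valid) >= required

print(can_release({"buyer", "vendor"}))   # True: buyer and vendor agree
print(can_release({"market"}))            # False: one signature is not enough
print(can_release({"buyer", "intruder"})) # False: unknown parties don't count
```

The design point is that even if the market itself disappears or is seized, buyer and vendor together can still recover the escrowed funds, which is why multisig reduces dependency on the site’s escrow.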
On making a purchase, the buyer must transfer cryptocurrency into the site’s escrow, after which a vendor dispatches their goods then claims the payment from the site. On receipt or non-receipt of the item, users may leave feedback against the vendor’s account. Buyers may “finalize early” (FE), releasing funds from escrow to the vendor prior to receiving their goods in order to expedite a transaction, but leave themselves vulnerable to fraud if they do so.
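The purchase flow just described is essentially a small state machine: funds enter escrow, the vendor ships, and payment is released either on confirmed receipt or when the buyer finalizes early. A toy simulation of those states — the class name, states, and amount are hypothetical, not any market’s actual implementation:

```python
class EscrowOrder:
    """Toy model of the escrow flow: created -> escrowed -> shipped
    -> released, with an optional early-release ('FE') shortcut."""

    def __init__(self, amount_btc):
        self.amount_btc = amount_btc
        self.state = "created"

    def fund_escrow(self):          # buyer transfers coins to the market
        assert self.state == "created"
        self.state = "escrowed"

    def vendor_ships(self):         # vendor dispatches the goods
        assert self.state == "escrowed"
        self.state = "shipped"

    def confirm_receipt(self):      # buyer confirms; vendor claims payment
        assert self.state == "shipped"
        self.state = "released"

    def finalize_early(self):
        # FE releases escrow before delivery: faster, but the buyer
        # loses all protection if the vendor never ships.
        assert self.state in ("escrowed", "shipped")
        self.state = "released"

order = EscrowOrder(amount_btc=0.05)
order.fund_escrow()
order.vendor_ships()
order.confirm_receipt()
print(order.state)  # released
```

The assertions encode why finalizing early is risky: it is the only transition that can skip the “shipped” state entirely, so nothing in the flow forces the vendor to deliver before being paid.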
Web design, security, and hosting may be provided by Tor development and hosting specialist companies.
2.3 Market types
Items on a typical centralized darknet market are listed from a range of vendors in an eBay-like marketplace format. Virtually all such markets have advanced reputation, search, and shipping features similar to Amazon.com.
Some vendors have opened dedicated online shops separate from the large marketplaces. Individual sites have even taken to operating on the clearnet, with mixed success.
Some Internet forums, such as the defunct Tor Carding Forum and the Russian Anonymous Marketplace, function as markets with trusted members providing escrow services and users engaging in off-forum messaging.
Following repeated failures of centralized infrastructure, a number of decentralized marketplace software alternatives have arisen using blockchain distributed database technology, including OpenBazaar, Shadow, BitBay, Bitmarkets, and Nxt.
To list on a market, a vendor may have undergone an application process via referral, proof of reputation from another market, or provided a cash deposit to the market.
Many vendors list their wares on multiple markets, ensuring they retain their reputation should a single marketplace close. Grams launched the “InfoDesk” service to allow central content and identity management for vendors.
Meanwhile, individual law enforcement operations regularly investigate and arrest individual vendors.
While a great many products are sold, drugs dominate the numbers of listings. Due to increased law enforcement attention, many markets refuse to list weapons or poisons. The original Silk Road refused to list anything for which the “purpose is to harm or defraud, such as stolen credit cards, assassinations, and weapons of mass destruction.”
Later markets, such as Evolution, would ban “child pornography, services related to murder/assassination/terrorism, prostitution, ponzi schemes, and lotteries,” but allow the wholesaling of credit card data. Markets such as AlphaBay Market would go on to host a significant share of the commercial fraud market, featuring carding, counterfeiting, and many related services.
In December 2014, the art exhibition “The Darknet: From Memes to Onionland” explored darknet culture. The exhibition featured the “Random Darknet Shopper,” which spent $100 in bitcoin per week on Agora. The aim was to explore the ethical and philosophical implications of darknet markets, which, despite high-profile, internationally co-ordinated raids, persist and flourish.
James Martin’s 2014 book “Drugs on the Dark Net: How Cryptomarkets are Transforming the Global Trade in Illicit Drugs” discusses how some vendors are even branding their opium or cocaine as “fair trade,” “organic,” or sourced from conflict-free zones.
According to 2014 studies by Martin, Aldridge and Décary-Hétu, and a January 2015 report from the Global Drug Policy Observatory, many harm reduction trends have been spotted. These include a reduction of the risks associated with street dealing. The vendor feedback system provides accountability for risks of mixing and side effects, and protection against scammers. Online forum communities provide information about safe drug use in an environment where users can anonymously ask questions. Some users report that using the online marketplaces has a moderating effect on their drug consumption due to the increased lead time in receiving the drugs as compared to street dealing.
Europol reported in December 2014: “We have lately seen a large amount of physical crime move online, at least the ‘marketing’ and delivery part of the business … [Buyers can] get the illegal commodity delivered risk-free to a place of their choice by the mailman or a courier, or maybe by drone in the future, and can pay with virtual currency and in full anonymity, without the police being able to identify either the buyer or the seller.”
In June 2015, the European Monitoring Centre for Drugs and Drug Addiction produced a report citing difficulties controlling virtual marketplaces via darknet markets, social media, and mobile apps.