In this work, a novel framework for detecting malicious traffic in IoT-enabled Metaverse networks is presented, ensuring that malicious network traffic is identified to support optimal Metaverse cybersecurity. First, the study raises a core security issue concerning cyberthreats in Metaverse networks and their privacy-breaching risks. Second, to address the shortcomings of efficient and effective network intrusion detection (NIDS) of dark web traffic, the study employs a quantization-aware trained (QAT) model consisting of a 1D CNN, a GRU, and fully connected layers (1D CNN-GRU-FCN), which addresses the memory contingencies of Metaverse NIDS models. The QAT model is made interpretable using eXplainable artificial intelligence (XAI) methods, namely SHapley Additive exPlanations (SHAP) and Local Interpretable Model-agnostic Explanations (LIME), to provide trustworthy model transparency and interpretability. Overall, the proposed method yields a storage benefit four times that of the unquantized model while attaining a high accuracy of 99.82%.
Authored by Ebuka Nkoro, Cosmas Nwakanma, Jae-Min Lee, Dong-Seong Kim
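For readers unfamiliar with quantization-aware training, the sketch below shows the general eager-mode QAT workflow in PyTorch on a toy 1D CNN flow classifier. It is a minimal illustration of the technique, not the authors' 1D CNN-GRU-FCN (recurrent layers such as GRUs typically need separate handling, e.g., dynamic quantization), and the model shape and flow length are assumptions.

```python
import torch
import torch.nn as nn
import torch.ao.quantization as tq

class TinyTrafficNet(nn.Module):
    """Toy 1D CNN flow classifier used only to illustrate QAT."""
    def __init__(self, flow_len=100, n_classes=2):
        super().__init__()
        self.quant = tq.QuantStub()      # fp32 -> quantized boundary
        self.dequant = tq.DeQuantStub()  # quantized -> fp32 boundary
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.fc = nn.Linear(16 * (flow_len // 4), n_classes)

    def forward(self, x):                # x: (batch, 1, flow_len)
        x = self.quant(x)
        x = self.features(x)
        x = x.reshape(x.size(0), -1)
        x = self.fc(x)
        return self.dequant(x)

model = TinyTrafficNet().train()
model.qconfig = tq.get_default_qat_qconfig("fbgemm")
tq.prepare_qat(model, inplace=True)
# ... ordinary training loop runs here with fake-quantization inserted ...
model.eval()
int8_model = tq.convert(model)  # int8 weights: ~4x smaller than fp32
```

Storing weights as int8 rather than fp32 is what yields the roughly fourfold storage saving reported in the abstract above.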
The Internet as a whole is a large network of interconnected computer networks and their supporting infrastructure, and it is commonly divided into three parts. The Surface Web consists of websites that can be indexed and accessed through standard search engines such as Google. The Internet's layers, however, stretch well beyond the surface material that most people reach in their everyday searches. The Deep Web is the subset of the Internet whose content cannot be indexed by regular search engines. The Dark Web, which extends into the deepest reaches of the Deep Web, contains data that has been purposefully hidden and can be accessed using Tor. Tor employs a network of volunteer devices to route users' web traffic through a succession of other users' computers, making it practically impossible to trace back to the source. In this paper, we analyze and present results about the Dark Web's presence in various spheres of society. We further examine Tor metrics: how the relay list is revised and how the number of users is estimated from client requests for directories, as one way of estimating the user population of anonymous networks. The analysis discusses the purposes for which the Dark Web is frequently used, with a focus on cybercrime, and how law enforcement plays the adversary role. It covers the hidden Dark Web markets, the services they provide, and the events that take place there, such as cybercrime, illegal money transfers, and sensitive communication. Without prior knowledge of the Dark Web, a novice can easily let threats or malware into their system; this risk can be reduced by knowing whether to use Windows or another OS, and whether to use a service such as a VPN, before entering the dark world. The paper also examines how much of the illegal community in these markets originates from India and what impact COVID-19 had on Dark Web markets. Our analysis is carried out by searching scholarly journal databases for current literature. By acting as a reference guide and presenting a research agenda, it contributes to the dark web field in an efficient way. This paper is intended solely for study purposes and as a set of precautionary measures for accessing the Dark Web.
Authored by Hardik Gulati, Aman Saxena, Neerav Pawar, Poonam Tanwar, Shweta Sharma
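As a rough illustration of the directory-request approach mentioned in the abstract above, Tor Metrics documents that clients make on the order of ten directory requests per day, so a daily user count can be approximated by dividing the reported request total. A minimal sketch follows; the request figure is made up for illustration.

```python
def estimate_daily_users(directory_requests, requests_per_client_per_day=10):
    """Tor Metrics-style estimate: relays report how many directory
    requests they served; dividing by ~10 requests per client per day
    gives a rough daily user count."""
    return directory_requests / requests_per_client_per_day

# Illustrative number only, not a real measurement:
print(estimate_daily_users(20_000_000))  # -> 2,000,000 estimated users
```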
When we set up a computer network, we need to know whether an attacker can get into the system. We need to run a series of tests that reveal the vulnerabilities of the network setup; this series of tests is commonly known as a penetration test. The need for penetration testing was not well known in the past. This paper highlights how penetration testing started and how it became as popular as it is today. The growth of the Internet played a big part in pushing the idea of penetration testing forward. The styles of penetration testing vary from physical to network or virtual-based testing, any of which can help a company become more secure. This paper presents the steps of penetration testing that a company or organization needs to carry out to find its own security flaws.
Authored by Devin Sweigert, Md Chowdhury, Nafiz Rifat
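As a minimal illustration of the reconnaissance and scanning step such tests begin with, the sketch below performs a plain TCP connect scan in Python. Real engagements use dedicated tools such as nmap, and scanning must only ever target systems you are explicitly authorized to test.

```python
import socket

def tcp_connect_scan(host, ports, timeout=0.5):
    """Report which of the given TCP ports accept a connection.
    Only run against hosts you are authorized to test."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 == connection succeeded
                open_ports.append(port)
    return open_ports

print(tcp_connect_scan("127.0.0.1", [22, 80, 443, 8080]))
```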
Researchers have investigated the dark web for various purposes and with various approaches. Most dark web data investigations have focused on analysing text collected from HTML pages of websites hosted on the dark web. In addition, researchers have documented work on dark web image data analysis for specific domains, such as identifying and analyzing Child Sexual Abuse Material (CSAM) on the dark web. However, image data from dark web marketplace postings and forums could also be helpful in forensic analysis of dark web investigations. The presented work attempts to conduct image classification on classes other than CSAM. Nevertheless, manually scanning thousands of dark web websites for visual evidence of criminal activity is time- and resource-intensive. Therefore, the proposed work uses quantum computing to classify the images with a Quantum Convolutional Neural Network (QCNN). The authors classified dark web images into four categories: alcohol, drugs, devices, and cards. The dataset used in the paper consists of around 1,242 images and combines an open-source dataset with data collected by the authors. The paper discusses the implementation of the QCNN and reports related performance measures.
Authored by Ashwini Dalvi, Soham Bhoir, Irfan Siddavatam, S Bhirud
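The paper's circuit details are not reproduced here, but a common QCNN building block is the "quanvolutional" layer: a small parametrized circuit applied to image patches. A minimal PennyLane sketch of that pattern, with an assumed 2x2 patch size and a random variational layer, might look as follows.

```python
import numpy as np
import pennylane as qml

n_qubits = 4  # one qubit per pixel of an assumed 2x2 patch
dev = qml.device("default.qubit", wires=n_qubits)
rand_params = np.random.uniform(0, np.pi, size=(1, n_qubits))

@qml.qnode(dev)
def quanv(patch):
    # Encode the four pixel values as single-qubit rotation angles.
    qml.AngleEmbedding(patch, wires=range(n_qubits))
    # A random variational layer plays the role of the "quantum filter".
    qml.RandomLayers(rand_params, wires=list(range(n_qubits)))
    # One expectation value per qubit becomes one output channel.
    return [qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]

print(quanv([0.1, 0.7, 0.3, 0.9]))
```

The expectation values feed a classical classifier head, which is one common way to hybridize quantum feature extraction with conventional training.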
Cyber threat intelligence (CTI) is vital for enabling effective cybersecurity decisions by providing timely, relevant, and actionable information about emerging threats. Monitoring the dark web to generate CTI is one of the upcoming trends in cybersecurity. As a result, developing CTI capabilities through dark web investigation is a significant focus for cybersecurity companies such as Deepwatch, DarkOwl, SixGill, ThreatConnect, CyLance, ZeroFox, and many others. In addition, dark web marketplace (DWM) monitoring tools are of much interest to law enforcement agencies (LEAs). The fact that darknet market participants operate anonymously and that online transactions are pseudo-anonymous makes it challenging to identify and investigate them; keeping up with the DWMs therefore poses significant challenges for LEAs today. Nevertheless, the offerings on the DWMs give LEAs insight into the dark web economy. The present work is one such attempt to describe and analyze dark web market data collected for CTI using a dark web crawler. After processing and labeling, the authors obtained 53 DWMs with their product listings and pricing.
Authored by Ashwini Dalvi, Gunjan Patil, S Bhirud
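The paper does not detail its crawler, but a typical ingredient of dark web crawling is routing requests through a local Tor SOCKS proxy so that .onion hosts resolve inside the Tor network. A minimal Python sketch of that pattern (assuming a Tor daemon listening on port 9050 and requests installed with the [socks] extra) is shown below.

```python
import requests

# Route traffic through a local Tor SOCKS proxy. "socks5h" is important:
# it resolves .onion hostnames inside Tor instead of locally.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def fetch_onion_page(url, timeout=60):
    resp = requests.get(url, proxies=TOR_PROXIES, timeout=timeout)
    resp.raise_for_status()
    return resp.text  # raw HTML, ready for parsing and labeling
```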
The value and volume of information exchanged through dark web pages are remarkable. Recently, many studies have shown the value of, and interest in, using machine-learning methods to extract security-related knowledge from those dark web pages. Within this scope, our research focuses on evaluating the best prediction models for analyzing traffic-level data coming from the dark web. Results and analysis showed that feature selection played an important role in identifying the best models, as the right combination of features can increase a model's accuracy. For some feature-set and classifier combinations, Src Port and Dst Port both proved to be important features: when available, they were always selected over most other features, and when absent, many other features were selected to compensate for the information they provided. The Protocol feature was never selected, regardless of whether Src Port and Dst Port were available.
Authored by Ahmad Al-Omari, Andrew Allhusen, Abdullah Wahbeh, Mohammad Al-Ramahi, Izzat Alsmadi
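As a hedged sketch of the kind of feature-selection experiment described above, the snippet below scores flow features with mutual information and with random-forest importances in scikit-learn. The CSV file and column names are hypothetical stand-ins for the paper's traffic-level features.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Hypothetical flow capture; the column names mimic common
# CICFlowMeter-style exports, not necessarily the paper's schema.
df = pd.read_csv("darkweb_flows.csv")
features = ["Src Port", "Dst Port", "Protocol", "Flow Duration", "Total Fwd Packets"]
X, y = df[features], df["Label"]

# Filter-style selection: score each feature independently.
selector = SelectKBest(mutual_info_classif, k=3).fit(X, y)
print(dict(zip(features, selector.scores_.round(3))))

# Embedded-style selection: importances from a fitted classifier.
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(dict(zip(features, clf.feature_importances_.round(3))))
```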
Web evolution and Web 2.0 social media tools facilitate communication and support the online economy. On the other hand, these tools are actively used by extremist, terrorist, and criminal groups. These malicious groups use the new communication channels, such as forums, blogs, and social networks, to spread their ideologies, recruit new members, market their illicit goods, and raise funds. They rely on the anonymous communication methods provided by the new Web. This malicious part of the web is called the "dark web". Dark web analysis has become an active research area in the last few decades, and multiple studies have been conducted to understand the adversary and plan countermeasures. We conducted a systematic literature review to identify the state of the art and the open research areas in dark web analysis, filtering the available research papers to obtain the most relevant work; this filtration yielded 28 studies out of 370. Our systematic review is based on four main factors: the research trends used to analyze the dark web, the employed analysis techniques, the analyzed artifacts, and the accuracy and confidence of the available work. Our review shows that most dark web research relies on content analysis and that forum threads are the most analyzed artifacts. The most significant observation is the lack of accuracy metrics or validation techniques in most of the relevant studies; researchers are therefore advised to use accuracy metrics and validation techniques in future work to guarantee the confidence of their results. In addition, our review identifies open research areas in dark web analysis that can be considered for future research.
Authored by Tamer Abdellatif, Raed Said, Taher Ghazal
Currently, the Dark Web is a key platform for the online trading of illegal products and services. Analysing the .onion sites hosting marketplaces is of interest to law enforcement and security researchers. This paper presents a study of 123k listings obtained from 6 different Dark Web markets. While most current works leverage existing datasets, these are outdated and might not contain new products, e.g., those related to the 2020 COVID pandemic. Thus, we built a custom focused crawler to collect the data. Being able to conduct analyses on current data is of considerable importance, as these marketplaces continue to change and grow, both in terms of products offered and users. Also, several anti-crawling mechanisms are being improved, making this task more difficult and, consequently, reducing the amount of data obtained from these marketplaces in recent years. We conduct a data analysis evaluating multiple characteristics of the products, sellers, and markets. These characteristics include, among others, the number of sales, the categories existing in the markets, and the origin of the products and the sellers. Our study sheds light on the products and services being offered in these markets nowadays. Moreover, we conducted a case study of one particularly productive and dynamic drug market, Cannazon. Our initial goal was to understand its evolution over time by analyzing the variation of products in stock and their prices longitudinally. We realized, though, that during the period of study the market suffered a DDoS attack, which damaged its reputation and affected users' trust in it, and which potentially led to the subsequent closure of the market by its operators. Consequently, our study provides insights into the last days of operation of such a productive market and showcases the effectiveness of a potential intervention approach based on disrupting the service and fostering mistrust.
Authored by Víctor Labrador, Sergio Pastrana
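As an illustrative sketch of the kind of listing-level analysis described above (not the authors' pipeline), the pandas snippet below aggregates scraped listings per market and tracks one market's prices over time. The file and column names are hypothetical.

```python
import pandas as pd

# Hypothetical schema: market, category, seller, price_usd, scraped_at.
listings = pd.read_csv("market_listings.csv", parse_dates=["scraped_at"])

# Product mix, seller counts, and typical prices per market.
per_market = listings.groupby("market").agg(
    n_listings=("category", "size"),
    n_sellers=("seller", "nunique"),
    median_price=("price_usd", "median"),
)
print(per_market.sort_values("n_listings", ascending=False))

# Longitudinal view of a single market, e.g., weekly median prices.
cannazon = listings[listings["market"] == "Cannazon"]
print(cannazon.resample("W", on="scraped_at")["price_usd"].median())
```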
Internet technology has made surveillance widespread and access to resources easier than ever before. This implied boon has countless advantages; however, it makes protecting privacy more challenging for the masses, while supplying anonymity to the few hacktivists. The ever-increasing frequency and scale of cyber-attacks have not only crippled private organizations but have also left Law Enforcement Agencies (LEAs) in a fix, as data depict a surge in cases of cyber-bullying and ransomware attacks while the force lacks the manpower to tackle such cases at a more microscopic level. What is needed is a tool, an automated assistant, that helps security officers cut down the precious time needed in the very first phase of information gathering: reconnaissance. Confronting the surface web along with the deep and dark web is not only tedious but also requires documenting the digital footprint of the perpetrator and identifying any Indicators of Compromise (IOCs). TORSION automates web reconnaissance using the Open Source Intelligence (OSINT) paradigm; it extracts metadata from popular indexed social sites and from un-indexed dark web onion sites, provided it has some related intelligence on the target. TORSION's workflow allows account matching across various top indexed sites, generates a dossier on the target, and exports the collected metadata to a PDF file which can later be referenced.
Authored by Hritesh Sonawane, Sanika Deshmukh, Vinay Joy, Dhanashree Hadsul
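TORSION's internals are not described above, but its account-matching step resembles the username-enumeration pattern used by OSINT tools such as Sherlock. The sketch below is a minimal, hypothetical version with only two site templates and a crude status-code presence check.

```python
import requests

# Two hypothetical profile-URL templates; real OSINT tools (cf. Sherlock)
# ship hundreds of patterns plus per-site "not found" heuristics.
SITES = {
    "GitHub": "https://github.com/{}",
    "Reddit": "https://www.reddit.com/user/{}",
}

def find_accounts(username, timeout=10):
    hits = {}
    for site, template in SITES.items():
        try:
            r = requests.get(template.format(username), timeout=timeout)
            hits[site] = r.status_code == 200  # crude presence check
        except requests.RequestException:
            hits[site] = False
    return hits

print(find_accounts("example_user"))
```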
Onion Routing is an encrypted communication system developed by the U.S. Naval Research Laboratory that uses existing Internet equipment to communicate anonymously. Miscreants use this means to conduct illegal transactions on the dark web, posing a security risk to citizens and the country. Against this means of anonymous communication, existing studies have used website fingerprinting methods. These methods often have high overhead and need to run on high-performance devices, which makes them inflexible. In this paper, we propose a lightweight method to address the high overhead that deep-learning website fingerprinting methods generally incur, so that the method can be applied on common devices while still ensuring accuracy to a certain extent. The proposed method follows the structure of the Inception network: it divides the original larger convolutional kernels into smaller ones and uses grouped convolution to reduce the model's parameter count and computation without causing too much negative impact on accuracy. The method was evaluated on the dataset collected by Rimmer et al. to demonstrate its effectiveness.
Authored by Dingyang Liang, Jianing Sun, Yizhi Zhang, Jun Yan
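The core parameter-saving idea above, small stacked kernels plus grouped convolutions, can be sketched in a few lines of PyTorch. The block below is an illustrative reconstruction, not the paper's exact architecture, and the trace dimensions are assumptions.

```python
import torch
import torch.nn as nn

class LightweightBlock(nn.Module):
    """Two small grouped convolutions in place of one large kernel."""
    def __init__(self, channels=32, groups=4):
        super().__init__()
        # Stacking two 3-wide kernels gives the receptive field of one
        # 5-wide kernel; groups=4 cuts each layer's weights by 4x.
        self.net = nn.Sequential(
            nn.Conv1d(channels, channels, 3, padding=1, groups=groups),
            nn.ReLU(),
            nn.Conv1d(channels, channels, 3, padding=1, groups=groups),
            nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

x = torch.randn(8, 32, 5000)          # batch of direction/timing traces
print(LightweightBlock()(x).shape)    # torch.Size([8, 32, 5000])
```

Because each grouped convolution only connects channels within its group, the weight tensor shrinks by the group factor, which is where the reduced overhead comes from.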
The deep web refers to sites that cannot be found by search engines and makes up 96% of the digital world. The dark web is the part of the deep web that can only be accessed through specialised tools and anonymity networks. To avoid monitoring and control, communities that seek anonymity are moving to the dark web. In this work, we scrape five dark web forums and construct five graphs to model user connections. These networks are then studied and compared using data-mining techniques and social network analysis tools: for each community we identify the key actors, study the social connections and interactions, observe the small-world effect, and highlight the types of discussions among users. Our results indicate that only a small subset of users is influential, while the rapid dissemination of information and resources between users may affect behaviours and shape ideas for future members.
Authored by Sotirios Nikoletos, Paraskevi Raftopoulou
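As a minimal sketch of the social-network measurements used in studies like the one above, the networkx snippet below computes degree centrality (to surface key actors) and the two classic small-world indicators on a toy reply graph; the edges are made up for illustration.

```python
import networkx as nx

# Toy reply graph: an edge means one forum user replied to another.
G = nx.Graph([("a", "b"), ("a", "c"), ("b", "c"), ("c", "d"), ("d", "e")])

# Key actors: users ranked by degree centrality.
centrality = nx.degree_centrality(G)
print(sorted(centrality, key=centrality.get, reverse=True)[:3])

# Small-world indicators: high clustering with short average paths.
print(nx.average_clustering(G))
print(nx.average_shortest_path_length(G))
```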
The amount of information shared regularly has increased as a direct result of the rapid growth in network administrators, Internet of Things devices, and online users. Cybercriminals constantly work to gain access to data stored and transferred online in order to accomplish their objectives, whether those objectives are to sell the data on the dark web or to commit another type of crime. A thorough literature analysis of the causes of, and problems with, wireless network security and privacy reveals a number of factors that can make these networks unpredictable, particularly the evolving skills of cybercriminals and the lack of significant effort by major bodies to combat them. Wireless networks have a built-in security flaw that renders them more vulnerable to attack than their wired counterparts. Additional problems arise in networks with node mobility and dynamic network topology, and intermittent availability, whether caused by mobility or by sporadic node sleep, poses unanticipated problems. Moreover, the limited resources of individual nodes make it difficult, if not impossible, to implement recently developed security measures. The Wireless Communication Network Security and Privacy research project examines the large-scale problems that arise in wireless networks and mobile computing. The security aspects considered include authentication, access control and authorization, non-repudiation, privacy and confidentiality, integrity, and auditing. Any good or service should be able to protect a client's personal information. The study follows a qualitative approach and uses a questionnaire as a research tool for IT and public-sector employees; this strategy reflects a higher level of precision for IT faculties.
Authored by Hoshiyar Singh, K Balamurgan