---
license: apache-2.0
task_categories:
- text-classification
- token-classification
- feature-extraction
language:
- en
tags:
- code
pretty_name: Balanced Ethereum Smart Contract
size_categories:
- 1B<n<10B
---
# Dataset Card for Balanced Ethereum Smart Contract
The rapid expansion of blockchain technology, particularly Ethereum, has driven widespread adoption of smart contracts. However, the security of these contracts remains a critical concern due to the increasing frequency and complexity of vulnerabilities. This paper presents a comprehensive approach to detecting vulnerabilities in Ethereum smart contracts using pre-trained Large Language Models (LLMs). We apply transformer-based LLMs, leveraging their ability to understand and analyze Solidity code to identify potential security flaws. Our methodology involves fine-tuning eight distinct pre-trained LLM models on curated datasets varying in types and distributions of vulnerabilities, including multi-class vulnerabilities. The datasets (SB Curate, the Benchmark Solidity Smart Contract dataset, and ScrawlD) were selected to ensure a thorough evaluation of model performance across different vulnerability types. We employed over-sampling techniques to address class imbalances, resulting in more reliable training outcomes. We extensively evaluate these models using precision, recall, accuracy, F1 score, and Receiver Operating Characteristic (ROC) curve metrics. Our results demonstrate that the transformer encoder architecture, with its multi-head attention and feed-forward mechanisms, effectively captures the nuances of smart contract vulnerabilities. The models show promising potential in enhancing the security and reliability of Ethereum smart contracts, offering a robust solution to challenges posed by software vulnerabilities in the blockchain ecosystem.
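The class-balancing step mentioned above can be illustrated with a minimal sketch. This is not the authors' exact pipeline; it shows plain random over-sampling, where minority-class records are duplicated until every class matches the majority class count. The record fields (`code`, `label`) and label names are illustrative assumptions, not the dataset's actual schema.

```python
import random

def oversample(records, label_key="label", seed=0):
    """Random over-sampling: duplicate minority-class records (with
    replacement) until every class matches the majority class count."""
    rng = random.Random(seed)
    by_label = {}
    for rec in records:
        by_label.setdefault(rec[label_key], []).append(rec)
    target = max(len(group) for group in by_label.values())
    balanced = []
    for group in by_label.values():
        balanced.extend(group)                                # originals
        balanced.extend(rng.choices(group, k=target - len(group)))  # duplicates
    return balanced

# Toy imbalanced corpus: 5 "reentrancy" vs. 2 "safe" contracts.
contracts = (
    [{"code": "contract A {}", "label": "reentrancy"}] * 5
    + [{"code": "contract B {}", "label": "safe"}] * 2
)
balanced = oversample(contracts)
counts = {}
for rec in balanced:
    counts[rec["label"]] = counts.get(rec["label"], 0) + 1
print(counts)  # both classes now have 5 examples
```

More sophisticated schemes (e.g. SMOTE-style synthetic sampling) exist, but simple duplication is a common baseline for text-classification corpora like this one.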
## Data Collection and Processing
This dataset was processed from three public datasets, which we used for the method proposed in [1]: a benchmark dataset of Solidity smart contracts, SB Curate, and ScrawlD.
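A hypothetical sketch of the merge step when combining records from the three source datasets into one labeled corpus. The field names (`source_code`, `label`) and the duplicate-dropping policy are assumptions for illustration, not the actual processing pipeline or schema.

```python
def merge_sources(*sources):
    """Merge (name, records) pairs into one corpus, tagging each record
    with its source and dropping contracts already seen in an earlier source."""
    merged, seen = [], set()
    for name, records in sources:
        for rec in records:
            key = rec["source_code"]
            if key in seen:  # same contract appearing in multiple datasets
                continue
            seen.add(key)
            merged.append({"source": name, **rec})
    return merged

# Toy records; real entries would hold full Solidity source and labels.
sb_curate = [{"source_code": "contract X {}", "label": "overflow"}]
benchmark = [{"source_code": "contract X {}", "label": "overflow"},
             {"source_code": "contract Y {}", "label": "safe"}]
scrawld   = [{"source_code": "contract Z {}", "label": "reentrancy"}]

merged = merge_sources(("SB Curate", sb_curate),
                       ("Benchmark", benchmark),
                       ("ScrawlD", scrawld))
print(len(merged))  # 3 unique contracts (duplicate X kept once)
```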
## Citation
[1] T.-T.-H. Le, J. Kim, S. Lee and H. Kim, "Robust Vulnerability Detection in Solidity-Based Ethereum Smart Contracts Using Fine-Tuned Transformer Encoder Models," in IEEE Access, vol. 12, pp. 154700-154717, 2024, doi: 10.1109/ACCESS.2024.3482389.
**BibTeX:**
```bibtex
@ARTICLE{10720785,
  author={Le, Thi-Thu-Huong and Kim, Jaehyun and Lee, Sangmyeong and Kim, Howon},
  journal={IEEE Access},
  title={Robust Vulnerability Detection in Solidity-Based Ethereum Smart Contracts Using Fine-Tuned Transformer Encoder Models},
  year={2024},
  volume={12},
  pages={154700-154717},
  keywords={Smart contracts;Codes;Transformers;Security;Solid modeling;Analytical models;Training;Encoding;Biological system modeling;Large language models;Ethereum smart contracts;large language models;multi-class imbalance;multi-class classification;smart contract vulnerability;solidity code},
  doi={10.1109/ACCESS.2024.3482389}
}

@misc{le_2025,
  author = {{Le}},
  title = {Balanced-Ethereum-Smart-Contract (Revision fba3170)},
  year = 2025,
  url = {https://huggingface.co/datasets/Thi-Thu-Huong/Balanced-Ethereum-Smart-Contract},
  doi = {10.57967/hf/4850},
  publisher = {Hugging Face}
}
```
## Dataset Card Contact
Email: [email protected] |