Blockchain Engineering projects - class of 2018 #3909
Project: Bitcoin-Intelligent-Life
Self-replicating Bitcoin-based entities using deep reinforcement learning
You will create a key prototype that advances the state of the art of self-replicating software, autonomy, and artificial intelligence. Your mission is not to terminate all human life.
Bitcoin is an essential technology for digital life: you can buy servers with it. In prior work, TU Delft has created Cloudomate. With Cloudomate you can buy servers with Bitcoin automatically. Cloudomate gives any computer the power to own money and replicate freely; no human can access that server, and no human can take that money away. It becomes an autonomous virtual entity. With Cloudomate it can even enhance its own privacy and buy VPN networking autonomously.
The next step is adding intelligence and learning to this autonomous virtual entity. You will combine buying servers with Bitcoin with another peak-hype technology of today: AI. You will create a basic intelligent entity, capable of self-replication and learning. For the learning side you can use a deep reinforcement learning engine in Python. See the basic tutorial.
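As a first learning experiment, the decision of when to spend Bitcoin on a new server can be framed as a reinforcement learning problem. The sketch below is a toy tabular Q-learning loop; all states, actions, and rewards are invented for illustration, and the project itself would use a deep RL engine against real Cloudomate actions:

```python
# Toy sketch (all names and rewards hypothetical): tabular Q-learning for
# an agent that decides whether to buy a new server or wait, given its balance.
import random
from collections import defaultdict

ACTIONS = ["wait", "buy_server"]

def step(balance, action):
    """Hypothetical environment in abstract credit units."""
    if action == "buy_server":
        if balance < 5:
            return balance, -1.0      # cannot afford a server: penalty
        return balance - 5, 2.0       # replication yields future income
    return balance + 1, 0.1           # waiting accrues income slowly

q = defaultdict(float)                # Q[(state, action)] -> value
alpha, gamma, eps = 0.1, 0.9, 0.2     # learning rate, discount, exploration

balance = 0
for episode in range(5000):
    state = min(balance, 10)          # coarse state: capped balance
    if random.random() < eps:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: q[(state, a)])
    balance, reward = step(balance, action)
    next_state = min(balance, 10)
    best_next = max(q[(next_state, a)] for a in ACTIONS)
    q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
```

A deep RL variant would replace the Q-table with a neural network, but the interaction loop stays the same.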
Possible sprints for the 10 weeks:
Outcome: running code and Github readme.md (no thick final report that nobody really reads).
Warning: this is a scientifically highly challenging assignment (recommended for 'Cum Laude'-level students).
Project: Measuring the use of Tor in Zcash
Motivation: Zcash is a privacy-oriented cryptocurrency which focuses on hiding the identities of the people involved in transactions. However, Zcash requires the use of anonymity networks such as Tor to guarantee network-level anonymity. In this study, you will figure out whether users indeed make use of Tor when participating in Zcash. Furthermore, you should measure how the use of Tor affects the performance of the system, answering questions such as: i) do Tor users receive blocks later? ii) are they less connected (which could potentially increase their vulnerability to attacks)?
Project: Measuring the use of Tor in Monero
Motivation: Monero is a privacy-oriented cryptocurrency which focuses on hiding the identities of the people involved in transactions. However, Monero requires the use of anonymity networks such as Tor to guarantee network-level anonymity. In this study, you will figure out whether users indeed make use of Tor when participating in Monero. Furthermore, you should measure how the use of Tor affects the performance of the system, answering questions such as: i) do Tor users receive blocks later? ii) are they less connected (which could potentially increase their vulnerability to attacks)?
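A possible starting point for both measurement projects, assuming you can observe the IP addresses of a node's peers, is to classify peers by membership in the set of known Tor exit relays. The entries below are hard-coded stand-ins; in practice you would fetch the current exit list from the Tor Project:

```python
# Sketch (hypothetical data): classify a node's peers as Tor / non-Tor
# by membership in the set of known Tor exit-relay addresses.

def classify_peers(peer_ips, tor_exit_ips):
    """Split a list of peer IPs into Tor-exit peers and clearnet peers."""
    tor = [ip for ip in peer_ips if ip in tor_exit_ips]
    non_tor = [ip for ip in peer_ips if ip not in tor_exit_ips]
    return tor, non_tor

exit_list = {"185.220.101.1", "199.87.154.255"}           # stand-in entries
peers = ["185.220.101.1", "8.8.8.8", "199.87.154.255"]
tor, clearnet = classify_peers(peers, exit_list)
print(len(tor), len(clearnet))   # 2 1
```

From there, block-arrival timestamps and peer counts can be compared between the two groups to answer questions i) and ii).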
Steps for both projects above.
To hand in:
Project: Implement Throughput Attack on Tor using Shadow config files
Anonymity when using cryptocurrencies requires the use of anonymity networks such as Tor. However, Tor is susceptible to a number of attacks. One particularly problematic attack is the throughput attack, which re-identifies users by correlating the throughput experienced by different users. Despite its severity, the impact of the throughput attack on the current Tor network (rather than Tor in 2011, when the attack was first discovered) is unclear. In this project, you will re-implement the attack in Shadow, a simulator for Tor, and test it on simulated networks. Your implementation should primarily rely on Shadow config files, so that it can work with any version of Tor. During the project, you will learn how to use network simulation to assess the security of an anonymity network. You will further gain insights into anonymity networks and attacks on anonymity.
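The core of the throughput attack can be illustrated in a few lines: the adversary correlates two throughput time series and concludes that a high correlation links the flows. The sketch below uses synthetic data and a plain Pearson correlation; the real attack and its evaluation in Shadow are considerably more involved:

```python
# Synthetic illustration: a flow on the same circuit as the victim shows a
# throughput time series that correlates strongly with the victim's, while
# an unrelated flow does not.
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
victim = [random.uniform(50, 150) for _ in range(60)]        # KB/s samples
same_circuit = [v + random.uniform(-5, 5) for v in victim]   # shared bottleneck
unrelated = [random.uniform(50, 150) for _ in range(60)]

print(pearson(victim, same_circuit) > pearson(victim, unrelated))  # True
```

In the project, the two series would come from Shadow-simulated relays and probed destinations rather than a random generator.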
To hand in:
Here at Booking.com, we envision a blockchain-based solution where municipalities, apartment owners, and various OTAs interact with each other, addressing the above-mentioned issues.
Supervised by: a Manager Software Development, a Team Lead in Development, and a Data Scientist from Booking.com.
Towards extreme smart contract parallelism
Smart contracts are a new execution model for applications that run on top of a blockchain. They allow developers to write business logic that enforces agreements between two or more parties without a trusted intermediary. Smart contracts can reason about money and enable automated value transfers. Currently, there are several thousand contracts deployed on the Ethereum blockchain, some of which manage assets worth billions of dollars.
Despite all the hype, traditional blockchain fabrics usually have poor transaction throughput. The performance of the Ethereum blockchain is theoretically limited to around 23 transactions per second, which is by far not enough to host all financial applications in the world. In fact, some applications, like CryptoKitties, have been popular enough to slow down the entire network and increase transaction costs to disproportionate values. This motivates the search for solutions with higher throughput and performance. For example, layer-one solutions (sharding) and layer-two solutions (state channels/Plasma/TrueBit) have been proposed by both academia and industry.
A more fundamental approach to improving the scalability of blockchains is to rely on different organisations of the distributed ledger. For example, the IOTA network maintains a Directed Acyclic Graph (DAG) where each transaction contains hash pointers to exactly two prior transactions. Another class of distributed ledgers are “pairwise” ledgers, where each participant grows and maintains their own tamper-proof chain of transactions. This type of distributed ledger is used by TrustChain, designed and implemented by Delft University of Technology. TrustChain has been deployed on the Internet and contains over 1.3 million blocks, created by almost 3,000 unique users (see http://explorer.tribler.org). By relying on fraud detection instead of fraud prevention, TrustChain achieves superior transaction throughput compared to Bitcoin and Ethereum.
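To make the pairwise structure concrete, a minimal sketch of a TrustChain-style block might look as follows (the field names are illustrative, not the actual TrustChain wire format): each block records a transaction and hash-links to the latest block of both participants.

```python
# Minimal sketch of a pairwise (TrustChain-style) block: every participant
# keeps its own chain, and each transaction block points to the previous
# block of *both* parties, entangling the two chains.
import hashlib
import json

def block_hash(block):
    """Deterministic hash over the block's canonical JSON encoding."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(tx, initiator, counterparty, prev_own, prev_other):
    return {
        "tx": tx,
        "initiator": initiator,
        "counterparty": counterparty,
        "prev_own": prev_own,      # hash of initiator's latest block
        "prev_other": prev_other,  # hash of counterparty's latest block
    }

genesis_a = make_block({"up": 0, "down": 0}, "A", "A", "0", "0")
b1 = make_block({"up": 10, "down": 2}, "A", "B", block_hash(genesis_a), "0")
print(block_hash(b1)[:8])
```

Tampering with any historical block changes its hash and breaks the links in both parties' chains, which is what enables fraud detection after the fact.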
The goal of this project is to explore how TrustChain, or pairwise distributed ledgers in general, can be used to execute (very simple) smart contracts. Possible workflow:
This is an exploratory project, which means that there are open challenges and questions. You are not expected to build a completely secure system, but you should be able to explain the decisions being made. Students should gain familiarity with TrustChain and the underlying networking library in the first few weeks of the project (see https://github.com/tribler/py-ipv8). Prior knowledge of Ethereum, smart contracts and/or state channels is recommended.
Supervised by me and @mitchellolsthoorn
Project Z-1 Secure and Privacy Preserving Decentralized Information Sharing System
The responsibility of the Centraal Justitieel Incassobureau (CJIB) is to make sure fines are collected. This task is given to CJIB by the Ministry of Justice and Security. CJIB therefore plays an important and central role in collecting fines, which requires transparency, auditability, and trust.
One of the challenges CJIB is facing is the case where a citizen cannot pay his/her fine for financial reasons. This citizen's financial situation is privacy-sensitive information that is not accessible to CJIB. The only organization that might know the financial situation of a certain citizen is the municipality where he/she resides. Without knowing the financial situation of that citizen, the procedure followed by CJIB results in unpleasant consequences, including penalties to be paid and even a court case.
The scenario described above could easily be resolved if CJIB knew the financial situation of that particular citizen. One straightforward approach to solve this problem is to share financial status information between CJIB and the municipalities. Unfortunately, no such system exists, other than receiving a letter from the municipality in some cases.
This project will address the problem by designing a secure and privacy-preserving decentralized information sharing system.
Supervisors: Zeki Erkin and CJIB
Project Z-2 Decentralized real estate information sharing market
Characteristic: a specific characteristic of a property (size, price, location, etc.)
Data point: all information regarding one historical property transaction
The valuation of real estate is done to determine, amongst other things, the listing price, the tax value of a property, or its value as collateral for a loan. A valuation is performed by a company that uses market data (historical transaction data) to determine the value of a property. This data is used as input for a statistical analysis tool, which determines the value of the property. For an accurate valuation, the valuer requires sufficient historical transactions in a large data set, containing properties comparable to the one being valued.
A brief list of information that is stored in a data set to determine the value of a property:
Future market (Decentralized information sharing platform):
To overcome the shortcomings of the current market, a digital system can be created on which data points can be shared amongst companies, thereby reducing the chance of double spending, reducing the bias from using a single data set, and limiting the potential for information asymmetry. To make this platform interesting, companies must be able to download data of similar quality to what they upload, thereby facilitating that companies provide data points of equal quality to each other.
A company possessing data can upload it onto the platform. The data set is subdivided into data points with their respective pieces of information (location, floor area, construction date, transaction price, transaction date, and property type). When a data set is uploaded, the system will run an algorithm to check, for each data point, whether a property with the given location and property type already exists in the system.
Each data point that is accepted into the system will receive an invisible identifier, linking it to the “owner/uploader” and thereby assigning ownership of this data point to the uploader. The quality of individual data points is determined by an algorithm. Based on this quality assessment, the uploader receives credits on the platform: the higher the quality of a data point, the more credits the uploader receives. With these credits the uploader is able to buy data points from other owners, for which a credit amount corresponding to the quality of the data point will be subtracted. This makes it necessary for participants to share data on the platform in order to be able to use data points uploaded by others.
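The credit mechanism described above can be sketched as follows; the quality scores and the rule "cost equals quality" are placeholders for the platform's actual algorithm:

```python
# Toy sketch of the credit accounting: uploading earns credits proportional
# to data quality; buying access to a data point costs its quality score,
# which is transferred to the owner.
class Platform:
    def __init__(self):
        self.credits = {}
        self.points = {}          # point_id -> (owner, quality)

    def upload(self, owner, point_id, quality):
        self.points[point_id] = (owner, quality)
        self.credits[owner] = self.credits.get(owner, 0) + quality

    def buy(self, buyer, point_id):
        owner, quality = self.points[point_id]
        if self.credits.get(buyer, 0) < quality:
            raise ValueError("not enough credits: upload data first")
        self.credits[buyer] -= quality
        self.credits[owner] = self.credits.get(owner, 0) + quality

p = Platform()
p.upload("alice", "pt1", quality=3)
p.upload("bob", "pt2", quality=5)
p.buy("bob", "pt1")       # costs 3 credits, paid to alice
print(p.credits)          # {'alice': 6, 'bob': 2}
```

The transfer of credits from buyer to owner is what forces participants to contribute data of their own before consuming others' data.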
When a valuer is requested to perform a valuation, he will first determine whether he has adequate data points available for an accurate valuation. This can be done by reviewing whether the accessible data points are sufficiently comparable to the property at hand. After that, the valuer can add data points from other uploaders to reduce data set bias. If the valuer does not have access to a sufficient amount of data, he will be required to buy new data points from another uploader.
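As an illustration of the valuation step, a naive comparables-based estimate could filter the data set on property type and floor area and average the price per square metre; the thresholds and fields below are assumptions:

```python
# Illustrative sketch: select comparable data points (same property type,
# floor area within a tolerance) and estimate the subject's value as the
# mean price per m2 of the comparables times the subject's floor area.
def comparables(dataset, subject, area_tolerance=0.2):
    return [
        d for d in dataset
        if d["type"] == subject["type"]
        and abs(d["area"] - subject["area"]) <= area_tolerance * subject["area"]
    ]

def estimate_value(dataset, subject):
    comps = comparables(dataset, subject)
    if not comps:
        raise ValueError("insufficient comparable data points")
    price_per_m2 = sum(c["price"] / c["area"] for c in comps) / len(comps)
    return price_per_m2 * subject["area"]

data = [
    {"type": "apartment", "area": 80, "price": 240000},
    {"type": "apartment", "area": 90, "price": 288000},
    {"type": "house", "area": 120, "price": 480000},
]
print(estimate_value(data, {"type": "apartment", "area": 85}))  # 263500.0
```

The `ValueError` branch corresponds to the case where the valuer must first buy additional data points on the platform.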
Supervision: Zeki Erkin, Ilir Nase and Pim van Doleweerd
Project Z-3 Know Your Customer
When applying for a loan, the applicant has to prove his/her creditworthiness.
To determine someone's credit score, there is currently a cumbersome process whereby an intermediary or advisor needs 4 to 6 weeks to determine the credit score. First, a telephone conversation is held with the intermediary. In that conversation, they discuss which documents must be submitted (for example, payslips, pension details, passport, bank statements, marriage certificate, etc.). Subsequently, another conversation takes place and, after several back-and-forths, a determination is made about the credit score. This cumbersome and time-consuming process is typical when applying for a mortgage or buying or renting a house or car, but is also more widely visible when determining your general credit score. Besides being time-consuming, the current method is often unsafe. Credit scores are currently managed centrally, which in the US recently led to the theft of sensitive personal information of more than 140 million people. Due to recent events such as the unauthorized use of the data of 84 million Facebook users and the Equifax data breach (theft of the credit scores and personal data of consumers in the US), the trust of users in organizations that manage large amounts of data is heavily damaged.
Real-time, trusted, and safe insight into your credit score is currently not yet possible. Looking at developments in society, however, this is very desirable. The social importance of this project is therefore twofold. First, with the help of the smart contract principle, the obstacle to applying for funding will be largely removed. A safer, simplified, and more efficient process leads to a more attractive way to check a credit score. The threshold is thus largely removed for citizens, which indirectly benefits the national economy. Secondly, the software will include an alert function, which warns against (potential) credit dangers. This way, citizens who are in the danger zone can be proactively guided and assisted with their financial problems. In this way, citizens benefit from a smart contract solution. For companies, this is also a more efficient and less error-prone way of working and extending credit.
Lizard Apps sees an opportunity in building a smart contracts platform using blockchain technology. Blockchain technology enables trust between two parties that do not know each other by means of encryption and decentralized storage of data. The platform that Lizard Apps wants to develop encrypts the data and stores it in a decentralized manner. In this way, the user decides for himself which data he provides and to whom. The platform to be developed therefore makes it impossible to view the data of citizens before they give permission for this via their DigiD.
This credit score software will consist of the following three main components (in addition to blockchain):
Step 1. Storage of credit information (see figure 1): parties encrypt the relevant contract information. The data is encrypted by means of their signature, e-recognition (eHerkenning), and DigiD respectively. This data is then stored on the chain.
Step 2. Credit score check (see figure 2): third parties can retrieve the user's credit score. This is only possible if the user gives permission by means of his DigiD.
Step 3. Checking credit obligations: the user can check all of his own different credit obligations. He can do this by logging into an instance of the service with his DigiD.
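The permission flow of steps 1-3 can be sketched as follows, with DigiD authentication abstracted into explicit consent grants (all names are hypothetical; encryption and on-chain storage are omitted):

```python
# Sketch of consent-gated credit score access: third parties can only read
# a user's score after the user has granted access, which in the real
# system would happen via a DigiD login.
class CreditRegistry:
    def __init__(self):
        self.scores = {}
        self.consents = set()     # (user, third_party) pairs

    def store(self, user, score):
        self.scores[user] = score

    def grant(self, user, third_party):
        # In the real system this step would require the user's DigiD login.
        self.consents.add((user, third_party))

    def check(self, third_party, user):
        if (user, third_party) not in self.consents:
            raise PermissionError("user has not granted access via DigiD")
        return self.scores[user]

reg = CreditRegistry()
reg.store("alice", 720)
reg.grant("alice", "bank")
print(reg.check("bank", "alice"))   # 720
```

In the actual design, the scores would additionally be encrypted and stored on the chain, so even the registry operator cannot read them without consent.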
Supervisor: Zeki Erkin. This project is a collaboration between LizardApps and Nationale Nederlanden.
Research project proposal PortCoin
By: BlockLab & Port of Rotterdam
Date: Feb 2018
One of the largest constraints in supply chain optimization projects is facilitating and stimulating cooperation and information sharing between companies that have no contractual connection between them, but are ‘forced’ to work together due to their customers' arrangements.
Example: deepsea container terminals handle barge, rail, and road volumes. The shipper/receiver/forwarder (hereafter called the inland operator) makes operational appointments, called slots, for their modalities to be handled at the deepsea terminal. The deepsea terminal invoices its customer (the shipping line), and the barge operator invoices its customer (the inland operator). There is no contractual relationship between the deepsea terminal and the inland operator, yet they have to work together to arrange a smooth transition between sea and inland transport. Due to the absence of a contract, neither the deepsea terminal nor the inland operator has an incentive to adhere to the planned slots. Furthermore, the slots are not interchangeable, and terminal and barge do not have a common view of each other's transactions. As a result, allocated slots are lost, affecting both terminal and barge operating efficiency.
Develop a working prototype that demonstrates how these types of appointments can be valued and traded on a (spot and/or futures) market with the use of new technology, for example cryptocurrency, and how the Port Authority should facilitate this.
Establish a default/standard for such a market, including standardization of transaction data, pricing algorithms and taking into consideration privacy. Launch PortCoins and issue allowance to each major actor in the supply chain. Launch a trading platform (PortExchange) to facilitate the trading/exchange of agreements and of potential other cryptocurrencies that aim to optimize logistics.
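A minimal sketch of such a PortExchange spot market might look as follows: actors hold a PortCoin balance and trade slots they cannot use. All names, prices, and rules below are illustrative assumptions, not the proposed standard:

```python
# Toy sketch of a PortExchange spot market: slots that would otherwise be
# lost are offered for sale, and PortCoin balances move between actors.
class PortExchange:
    def __init__(self):
        self.balances = {}
        self.offers = {}          # slot_id -> (seller, price)
        self.owners = {}          # slot_id -> current owner

    def issue(self, actor, amount):
        """Port Authority issues a PortCoin allowance to an actor."""
        self.balances[actor] = self.balances.get(actor, 0) + amount

    def allocate_slot(self, actor, slot_id):
        self.owners[slot_id] = actor

    def offer(self, seller, slot_id, price):
        assert self.owners[slot_id] == seller
        self.offers[slot_id] = (seller, price)

    def buy(self, buyer, slot_id):
        seller, price = self.offers.pop(slot_id)
        if self.balances.get(buyer, 0) < price:
            raise ValueError("insufficient PortCoin balance")
        self.balances[buyer] -= price
        self.balances[seller] = self.balances.get(seller, 0) + price
        self.owners[slot_id] = buyer

ex = PortExchange()
ex.issue("terminal", 100)
ex.issue("inland_operator", 100)
ex.allocate_slot("inland_operator", "slot-42")
ex.offer("inland_operator", "slot-42", price=30)
ex.buy("terminal", "slot-42")
print(ex.owners["slot-42"], ex.balances)
```

Pricing the slots (the "pricing algorithms" mentioned above) and hardening the exchange against manipulation are exactly the open questions of the project.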
Handling explosive blockchain growth
To handle the explosive influx of information, several solutions have been suggested. Bitcoin has the notion of lightweight and full nodes; in this model, lightweight nodes do not store the full blockchain and are therefore capable of running on mobile devices. Ethereum has proposed sharding, where each node stores only a shard of the network's data. In TrustChain, you only store the transactions you are a part of yourself. These solutions only mitigate the problem of database growth, though: given enough time, individual databases will still grow too large for disks to handle.
For this assignment, you will be given the TrustChain database as crawled from our users in the wild. You will be tasked with analyzing the workload of the database (insertion frequency, entry disk size, etc.) and creating a performance benchmark. Once you have established your baseline, you will be tasked with improving over it and providing a solution (for example, by using a graph database) for the wild growth of data in real blockchain deployments.
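A baseline benchmark could, for instance, time bulk inserts into an SQLite table loosely resembling a TrustChain block store. The schema and payload sizes below are assumptions; the crawled database dictates the real workload:

```python
# Sketch of a baseline insertion benchmark: insert 10,000 synthetic blocks
# into an in-memory SQLite table and report the sustained insert rate.
import os
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE blocks ("
    "public_key BLOB, sequence_number INTEGER, payload BLOB, "
    "PRIMARY KEY (public_key, sequence_number))"
)

start = time.perf_counter()
with conn:                                        # one transaction for all rows
    for seq in range(10000):
        conn.execute("INSERT INTO blocks VALUES (?, ?, ?)",
                     (b"peer-0", seq, os.urandom(200)))
elapsed = time.perf_counter() - start
print(f"{10000 / elapsed:.0f} inserts/s")
```

Running the same harness against candidate backends (an on-disk SQLite file, a graph database, etc.) with realistic payload-size distributions gives the comparison the assignment asks for.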
Energy Data Marketplace:
When it comes to using, utilizing, and processing data on a decentralized platform, there are many initiatives trying to solve these issues by utilizing legal frameworks for selling and using data, combined with marketplaces built on properly decentralized protocols. A good example of such an initiative is the Ocean Protocol.
The Open Energy Hub (OEHU) shares this vision of more open future data infrastructures, in which the role of the customer is replaced with that of the data producer/prosumer: those who generate data get compensated for doing so, financially or otherwise. OEHU focuses on energy data and currently consists of smart electricity meters pushing data into a BigchainDB network via a Raspberry Pi attached to the meter's P1 port. The data is currently open for anyone to access via the API and is in plaintext/JSON format.
This assignment is twofold.
We would like to see a future implementation of this concept, or an upgrade to the existing system, which enables users to push data that is either anonymized or encrypted prior to upload to the BigchainDB network (OEHU platform). Data will be private or useless to outside users until permission to access the allocated amount of data is granted by the owner, who is suitably compensated for doing so.
A data mandate system, which would allow users to sell their data, thus needs to be implemented. Part of this system, affecting whether the data is anonymized or simply encrypted prior to being uploaded, would be a way of making the data useful to the buyer again upon transfer. This means that a buyer should be able to see the data headers that are available, but not the values themselves.
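The "headers visible, values sealed" idea can be sketched as follows. Note that the XOR keystream below is a toy placeholder only; a real implementation would use an authenticated cipher such as AES-GCM:

```python
# Sketch: publish a record's headers in plaintext while sealing its values
# under the owner's key, so buyers can see *what* is offered but not the
# measurements themselves until the owner grants (sells) access.
import hashlib
import json

def keystream(key, n):
    """Toy SHA-256 counter keystream; placeholder for a real cipher."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def seal(record, key):
    plaintext = json.dumps(record["values"]).encode()
    cipher = bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))
    return {"headers": list(record["values"]), "sealed": cipher.hex()}

def unseal(sealed, key):
    cipher = bytes.fromhex(sealed["sealed"])
    plain = bytes(a ^ b for a, b in zip(cipher, keystream(key, len(cipher))))
    return json.loads(plain)

reading = {"values": {"timestamp": "2018-05-01T00:00", "kwh": 3.2}}
public = seal(reading, b"owner-secret")
print(public["headers"])             # ['timestamp', 'kwh'] - visible to buyers
```

Selling the data would then amount to handing over (or re-encrypting to the buyer) the key for the purchased records.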
Supervisor: ... & OEHU / BlockLab