The Ultimate Performance Challenge
Optimize your code to process one billion rows in record time. Compete with developers worldwide and claim your spot on the leaderboard.
import time

def process_billion_rows():
    start_time = time.time()
    # Your optimized code here
    return time.time() - start_time
Global Leaderboard

Rank | User | Time | Language
Get Started
Download Test Dataset
Start developing with our 1M-row test dataset. It's perfect for local development and for testing your algorithms.
Rows: 1M
Size: 13 MB
Stations: 302
💡 Use this dataset for local development. Final testing uses the full billion-row dataset.
Sample Data
Chengdu=9.1
Berlin=-9.3
Krakow=15.9
Tampa=27.6
...
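As a starting point, a naive solution might stream the file line by line and aggregate per-station statistics. The exact required output isn't specified on this page, so the min/max/mean aggregation below is an assumption, and `measurements.txt` is a hypothetical filename:

```python
import time

def baseline(path="measurements.txt"):
    # Naive single-threaded baseline: parse "Station=Temperature" lines
    # and track [min, max, sum, count] per station.
    stats = {}
    start = time.time()
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            station, _, temp = line.rstrip("\n").partition("=")
            t = float(temp)
            s = stats.get(station)
            if s is None:
                stats[station] = [t, t, t, 1]
            else:
                if t < s[0]:
                    s[0] = t
                if t > s[1]:
                    s[1] = t
                s[2] += t
                s[3] += 1
    elapsed = time.time() - start
    return stats, elapsed
```

Typical optimizations from there include reading the file in large binary chunks, avoiding per-line `float` parsing overhead, and splitting the work across processes.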
Submit Your Solution
Ready to compete?
Submit your optimized solution through GitHub and see how you rank against the world's best developers.
1. Fork the repository
2. Add your solution
3. Create a PR
📁 Fork → 💻 Code → 🚀 PR
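The three steps above map onto a standard GitHub workflow. The repository URL, directory layout, and branch name below are placeholders, not the challenge's actual ones:

```
# 1. Fork the repository on GitHub, then clone your fork
#    (YOUR_USERNAME and the repo name are placeholders).
git clone https://github.com/YOUR_USERNAME/billion-row-challenge.git
cd billion-row-challenge

# 2. Add your solution on a feature branch
git checkout -b my-solution
cp ~/work/solution.py solutions/
git add solutions/solution.py
git commit -m "Add my optimized solution"

# 3. Push the branch, then open a pull request against the upstream repo
git push origin my-solution
```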