Draft: Start doing #72 GoodBad ISPs page improvements

bauruine requested to merge bauruine/community:isp_list_json into main


This is my first draft for #72 (closed). I've moved the data into a JSON file, which makes it easier to add the consensus-weight (cw) fractions later. At the moment I'm using a hacky script that generates the JSON from the existing page and also adds the cw fraction. I'm not sure what the best way to integrate this is. Should I create a plugin that parses the metrics every time the site is generated? Any other ideas?

import re
import json
import requests

# Onionoo details endpoint (URL left blank in this draft).
resp = requests.get("")
onionoo_data = resp.json()

# Sum the consensus weight fraction, as a percentage, per AS.
asn_cw = {}

for relay in onionoo_data['relays']:
    if relay["running"]:
        asn = relay["as"]
        if asn not in asn_cw:
            asn_cw[asn] = 0
        asn_cw[asn] += relay.get("consensus_weight_fraction", 0) * 100

# Path to the existing GoodBadISPs page (left blank in this draft).
content = open('', 'r').readlines()

json_content = {}
country = None
for line in content:
    # Country names are `###` headings; provider rows are markdown table rows.
    if re.match(r'###', line):
        country = re.search(r'### (.*)', line).group(1)
    if re.match(r'\|', line):
        splited = line.split("|")
        if country not in json_content:
            json_content[country] = []
        # The first table cell is a markdown link: [provider name](url).
        split_name_url = re.search(r'\[(.*?)\]\((.*?)\)', line)
        if split_name_url:
            asn = splited[2].strip()
            if asn in asn_cw:
                cw_fraction = f"{round(asn_cw[asn], 2)}%"
            else:
                cw_fraction = ""
            json_content[country].append({'name': split_name_url.group(1),
                                          'url': split_name_url.group(2),
                                          'asn': asn,
                                          'bridge': splited[3].strip(),
                                          'relay': splited[4].strip(),
                                          'exit': splited[5].strip(),
                                          'comment': splited[6].strip(),
                                          'last_updated': splited[7].strip(),
                                          'cw_fraction': cw_fraction})

# Emit the generated JSON.
print(json.dumps(json_content, indent=4))
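
To show what "easier to get the cw fractions later" could look like when the site is built, here is a minimal sketch of a consumer of the generated JSON. The `render_country` helper, the sample data, and the `name`/`url` key names are my assumptions for illustration, not part of the script above:

```python
# Hypothetical sketch: turn one country's JSON entries back into a
# markdown table, now with a consensus-weight column.

def render_country(country, entries):
    header = "| Provider | ASN | Bridge | Relay | Exit | CW % | Comments | Last Updated |"
    sep = "|---" * 8 + "|"
    rows = [f"### {country}", header, sep]
    for e in entries:
        rows.append(
            "| [{name}]({url}) | {asn} | {bridge} | {relay} | {exit} | {cw} | {comment} | {updated} |".format(
                name=e["name"], url=e["url"], asn=e["asn"],
                bridge=e["bridge"], relay=e["relay"], exit=e["exit"],
                cw=e["cw_fraction"] or "n/a",  # fall back when the AS is not in the consensus
                comment=e["comment"], updated=e["last_updated"],
            )
        )
    return "\n".join(rows)

# Made-up sample entry in the shape the generator script produces.
sample = [{
    "name": "Example ISP", "url": "https://example.com",
    "asn": "AS64496", "bridge": "Yes", "relay": "Yes", "exit": "No",
    "comment": "-", "last_updated": "2023-01", "cw_fraction": "1.23%",
}]
print(render_country("Germany", sample))
```

A plugin could run this kind of rendering at build time, so the page itself never goes stale.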