Good afternoon, everyone. There is a memory leak in my Python parser: if the script runs for a long time, it gradually consumes all available memory. I would be grateful for any help. Code below.

```python
import sys
import requests
import pandas as pd

data = pd.DataFrame({'id': [1], 'last': [1], 'lowestAsk': [1], 'highestBid': [1],
                     'percentChange': [1], 'baseVolume': [1], 'quoteVolume': [1],
                     'isFrozen': [1], 'high24hr': [1], 'low24hr': [1], 'date': [1]})
while 1:
    try:
        url = 'https://poloniex.com/public?command=returnTicker'
        responce = requests.get(url, timeout=10).json()['USDT_BTC']
        url = 'https://poloniex.com/public?command=returnTradeHistory&currencyPair=USDT_BTC'
        responce.update({'date': requests.get(url, timeout=10).json()[0]['date']})
        if abs(float(responce['lowestAsk']) - float(data['lowestAsk'].tolist()[-1])) > 1:
            data = data.append(responce, ignore_index=True)
            data = data.drop(data.index[[0]])
            data.to_csv('pol1.csv', mode='a', header=False)
        del url, responce
    except:
        sys.stdout.write('pol is lagging')
```
  • How are you profiling the process? - Tihon
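Tihon's question is a good starting point: the standard-library `tracemalloc` module can show which lines of code are accumulating memory between two snapshots. A minimal sketch of the technique (the `retained` list here is just a deliberate stand-in for whatever the real loop is holding on to):

```python
import tracemalloc

tracemalloc.start()
before = tracemalloc.take_snapshot()

# Stand-in for one pass of the real polling loop: deliberately retain
# data so the comparison below has something to report.
retained = []
for _ in range(1000):
    retained.append('x' * 100)

after = tracemalloc.take_snapshot()
top_stats = after.compare_to(before, 'lineno')

# Entries with the largest positive size_diff point at the leaking lines.
for stat in top_stats[:5]:
    print(stat)
```

Running snapshots like this before and after a few hundred iterations of the real loop should name the allocation site directly.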

1 answer

Try this:

```python
import sys
import requests
import pandas as pd

url1 = 'https://poloniex.com/public?command=returnTicker'
url2 = 'https://poloniex.com/public?command=returnTradeHistory&currencyPair=USDT_BTC'
prev_lowestAsk = 0
while True:
    try:
        r = requests.get(url1, timeout=10).json()['USDT_BTC']
        r.update({'date': requests.get(url2, timeout=10).json()[0]['date']})
        # Poloniex returns numbers as strings, so convert before comparing;
        # note the closing paren of abs() belongs before the "> 1".
        if abs(float(r['lowestAsk']) - prev_lowestAsk) > 1:
            pd.DataFrame(r, index=[0]).to_csv('c:/temp/pol1.csv', mode='a',
                                              header=False, index=False)
            prev_lowestAsk = float(r['lowestAsk'])
    except:
        sys.stdout.write('pol is lagging')
```
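One thing worth trying regardless: the loop above opens a fresh HTTP connection on every `requests.get` call. Reusing a single `requests.Session` keeps one connection pool alive for the whole run, which reduces per-iteration allocation churn. A sketch, assuming the same two endpoints (the `fetch_json` helper name is mine, not from the answer):

```python
import requests

# One shared connection pool for the whole run, instead of a new
# transport adapter being created on every requests.get() call.
session = requests.Session()

def fetch_json(url, timeout=10):
    """Fetch and decode JSON via the shared session (hypothetical helper)."""
    resp = session.get(url, timeout=timeout)
    resp.raise_for_status()
    return resp.json()
```

In the answer's loop, `requests.get(url1, timeout=10).json()` would then become `fetch_json(url1)`.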
  • I tried it, but the memory still keeps leaking - Ivan
  • @Ivan, then you will have to dig deeper - MaxU
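Since Ivan reports the leak persists, another way to "dig" is to take pandas out of the loop entirely and append rows with the standard-library `csv` module, so there is no DataFrame left to accumulate anything. A sketch under that assumption (the `append_row` helper and the demo file are mine; the real loop would build `row` from the Poloniex responses exactly as before):

```python
import csv
import os
import tempfile

FIELDS = ['id', 'last', 'lowestAsk', 'highestBid', 'percentChange',
          'baseVolume', 'quoteVolume', 'isFrozen', 'high24hr', 'low24hr', 'date']

def append_row(path, row):
    """Append one ticker row to a CSV file, writing the header only once."""
    new_file = not os.path.exists(path)
    with open(path, 'a', newline='') as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS, extrasaction='ignore')
        if new_file:
            writer.writeheader()
        writer.writerow(row)  # missing fields are written as empty strings

# Demo against a temporary file (stand-in for 'pol1.csv'):
demo_path = os.path.join(tempfile.gettempdir(), 'pol_demo.csv')
if os.path.exists(demo_path):
    os.remove(demo_path)
append_row(demo_path, {'id': 1, 'lowestAsk': '6500.1', 'date': '2018-01-01'})
append_row(demo_path, {'id': 2, 'lowestAsk': '6501.5', 'date': '2018-01-02'})
with open(demo_path) as f:
    lines = f.read().splitlines()
```

Each iteration then touches only a small dict and a file append, so steady memory growth would clearly point at something other than pandas.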