
How to parse Json the easy way

Discussions regarding financial software
schober
Lemon Pip
Posts: 71
Joined: November 30th, 2017, 11:14 am
Has thanked: 1 time
Been thanked: 20 times

How to parse Json the easy way

#117454

Postby schober » February 11th, 2018, 8:01 pm

1) This command results in an unreadable json blob!
query1.finance.yahoo.com/v7/finance/quote?symbols=VOD.L

2) The solution; ............... install jq
https://stedolan.github.io/jq/

3) Now enter this code at the command line
curl 'query1.finance.yahoo.com/v7/finance/quote?symbols=VOD.L' | jq '.'

4) Or better save this code & make it executable; invoke with filename1 VOD.L at the command line

#!/bin/bash
qu="query1.finance.yahoo.com/v7/finance/quote?symbols=$1"
curl "$qu" | jq '.quoteResponse.result[]'

5) The code below makes the data "usable" by converting it to csv format. Save the code & make it executable; invoke it with filename2 VOD.L

6) If anyone uses this data source please let us know what you did and what happened.



#!/bin/bash
##### Convert Json data to Csv and save to file. Invoke with filename PRU.L or filename MSFT etc ####
#epic="PRU.L"
epic=$1
curl "query1.finance.yahoo.com/v7/finance/quote?symbols=$epic" > /root/my-applications/Yhoo/vod
echo "Getting data from Yahoo and saving to file"
######################### Get Json data from file & filter through jq #######################
IFS=$'\n'
i=0
ar=()
while IFS= read -r line; do
#printf '%s\n' "$line"
ar[$i]=$line
let i=i+1
done <<< "$(jq '.quoteResponse.result[0]' /root/my-applications/Yhoo/vod)"
Len=${#ar[@]}
unset ar[0] # Delete { & } at start and end of array
unset ar[$Len-1]

#for i in ${ar[@]};do echo "$i"; done
echo "Array filled and ready for processing into csv format"
################# Convert each line into csv format ####################
for ((i=0; i<Len; i++)); do
ar[i]="$(echo "${ar[i]}" | tr -d '[:space:]')" # Delete spaces
ar[i]="$(echo "${ar[i]}" | tr -d '["]')" # Delete quotes (and stray brackets)
ar[i]="$(echo "${ar[i]}" | tr -d '[,]')" # Delete commas (and stray brackets)
ar[i]=${ar[i]/:/,} # Change the first colon into a comma
#echo "${ar[i]}"
done

################## Convert TimeStamps to Time ########################
echo "Converting TimeStamps etc to Time"
for ((i=0; i<Len; i++)); do
if [[ ${ar[$i]} =~ "Timestamp" ]] || [[ ${ar[$i]} =~ "MarketTime" ]] ;then
dte=$(date -d @$(echo "${ar[$i]}" | cut -d, -f2) +"%d-%b-%y") # Generate dates from values
#echo "$dte"
key=$(echo "${ar[$i]}" | cut -d, -f1) # Get key to data pair
ar[$i]=$key,$dte
fi
done

########### Empty file & Copy data to file & open file in spreadsheet ##########

: > /root/my-applications/Yhoo/YhCsv # Empty the file
IFS=$'\n'
for i in "${ar[@]}"
do
[ -n "$i" ] && echo "$i" >> /root/my-applications/Yhoo/YhCsv
done
echo "Saved $epic array to YhCsv file"
gnumeric /root/my-applications/Yhoo/YhCsv # Display results in a spreadsheet

#################################### The End #####################################
# tested on GNU bash, version 3.00.16(1)-release (i486-t2-linux-gnu) & Linux Puppy Wary 5.3 & Gnumeric Spreadsheet 1.10.16. Needs jq at https://stedolan.github.io/jq/ to be installed
# The /root/my-applications/Yhoo file paths in the script will need changing to suit you
# Run with Filename VOD.L
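For anyone who would rather prototype the same flatten-and-convert steps outside bash, here is a minimal Python sketch. The sample response below is hypothetical, modelled on the v7 quote shape shown in this thread; a real run would fetch the JSON with curl or requests first.

```python
import json
from datetime import datetime, timezone

# Hypothetical sample modelled on the .quoteResponse.result[0] object
sample = json.loads('''
{"quoteResponse": {"result": [
  {"symbol": "VOD.L", "currency": "GBp",
   "regularMarketPrice": 210.55, "regularMarketTime": 1518372000}
], "error": null}}
''')

quote = sample["quoteResponse"]["result"][0]
lines = []
for key, value in quote.items():
    # Mirror the script's Timestamp/MarketTime handling: epoch seconds -> date
    if "Timestamp" in key or "MarketTime" in key:
        value = datetime.fromtimestamp(value, tz=timezone.utc).strftime("%d-%b-%y")
    lines.append(f"{key},{value}")

print("\n".join(lines))
```

This prints the same sort of key,value pairs that end up in the YhCsv file.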

mc2fool
Lemon Half
Posts: 7812
Joined: November 4th, 2016, 11:24 am
Has thanked: 7 times
Been thanked: 3017 times

Re: How to parse Json the easy way

#117484

Postby mc2fool » February 11th, 2018, 10:41 pm

schober wrote:1) This command results in an unreadable json blob!
query1.finance.yahoo.com/v7/finance/quote?symbols=VOD.L

2) The solution; ...............

.............is http://jsonviewer.stack.hu/#http://query1.finance.yahoo.com/v7/finance/quote?symbols=VOD.L

pochisoldi
Lemon Slice
Posts: 941
Joined: November 4th, 2016, 11:33 am
Has thanked: 31 times
Been thanked: 462 times

Re: How to parse Json the easy way

#117547

Postby pochisoldi » February 12th, 2018, 11:07 am

I've been using the perl JSON module for Yahoo stock quotes.

Here's a stripped down copy of the subroutine I've been using.

#!/usr/bin/perl -w
use JSON qw( decode_json );

sub getyahoo
{
    my $ticker = $_[0];
    my $BASEURL = "https://query1.finance.yahoo.com/v7/finance/quote?symbols=";
    open(WEB, "wget -O - --quiet ".$BASEURL.$ticker." |") || die "failed: $!\n";
    my @page = <WEB>;
    close(WEB);
    for my $body (@page) {
        my $json_data = decode_json($body);
        my $json_resources = $json_data->{'quoteResponse'};
        my @results = @{ $json_resources->{'result'} };
        foreach my $f ( @results ) {
            my $price  = $f->{"regularMarketPrice"};
            my $symbol = $f->{"symbol"}; chomp($symbol);
            my $line = "$symbol:$price";
            print "$line\n";
        } # end for each in results
    } # end for each in page
}

getyahoo("VOD.L");
getyahoo("BP.L,RDSB.L,SBRY.L,TSCO.L");

Dod101
The full Lemon
Posts: 16629
Joined: October 10th, 2017, 11:33 am
Has thanked: 4343 times
Been thanked: 7534 times

Re: How to parse Json the easy way

#117595

Postby Dod101 » February 12th, 2018, 1:13 pm

schober wrote:1) This command results in an unreadable json blob!
query1.finance.yahoo.com/v7/finance/quote?symbols=VOD.L
[...full post and script snipped...]


I am so very glad that I think I can get through life without having the slightest understanding of this. :D

Dod

kiloran
Lemon Quarter
Posts: 4092
Joined: November 4th, 2016, 9:24 am
Has thanked: 3234 times
Been thanked: 2827 times

Re: How to parse Json the easy way

#117895

Postby kiloran » February 13th, 2018, 3:52 pm

It's good to see the bits of code from schober and pochisoldi (haven't played with Perl for 8 years or more), but I must say I like the jsonviewer URL from mc2fool. I think I'll be using that regularly for Yahoo prices.

--kiloran

schober
Lemon Pip
Posts: 71
Joined: November 30th, 2017, 11:14 am
Has thanked: 1 time
Been thanked: 20 times

Re: How to parse Json the easy way

#118474

Postby schober » February 15th, 2018, 9:40 pm

Thanks for posting your code PochiSoldi. Perl is a foreign country for me so I haven't been able to try it out. I would be interested to know how you use the data.

Jsonviewer & jq both do the same thing - they make the unreadable readable. They don't convert the data to csv format or turn the timestamps into dates.

I see the script as a starting point for further survey and analysis either with more bash or python or spreadsheets.

pochisoldi
Lemon Slice
Posts: 941
Joined: November 4th, 2016, 11:33 am
Has thanked: 31 times
Been thanked: 462 times

Re: How to parse Json the easy way

#118489

Postby pochisoldi » February 15th, 2018, 11:49 pm

schober wrote:Thanks for posting your code PochiSoldi. Perl is a foreign country for me so I haven't been able to try it out. I would be interested to know how you use the data.


I have a cgi script executed on demand on a personal web server which scrapes share data plus price data from several sources (three sets of pension fund prices, and an ISA provider's unit prices), the current market currency rate for three currencies, and as a bonus some market indices.

It then displays this as a simple web page. The data collected is also used to create a downloadable csv file. This is in a fake FT.com csv format.
I can then download that data and import it into hleOFXquotes which then turns it into an OFX file which then gets exported to MS Money 2002 every Saturday morning, so I have a weekly share/fund prices update.

My pension/ISA fund feeds come directly from scraping single pages from each of the providers (one subroutine to handle two sets of fund prices from one pension provider, another from a 2nd pension provider, and then another subroutine to scrape data for the ISA).
Currency rates are scraped from Bloomberg

The webpage also scrapes market indices using the Yahoo API, and does a back of a fag packet calculation for one stock which isn't traded in £.
(but neither these values nor the currency updates make it into the csv file).

PochiSoldi

mc2fool
Lemon Half
Posts: 7812
Joined: November 4th, 2016, 11:24 am
Has thanked: 7 times
Been thanked: 3017 times

Re: How to parse Json the easy way

#118582

Postby mc2fool » February 16th, 2018, 1:24 pm

schober wrote:Thanks for posting your code PochiSoldi. Perl is a foreign country for me so I haven't been able to try it out. I would be interested to know how you use the data.

Jsonview & jq both do the same thing - they make the unreadable readable. They don't convert the data to csv format or turn the timestamps into dates

I see the script as a starting point for further survey and analysis either with more bash or python or spreadsheets.

If I understand your script correctly then it doesn't produce CSV format but rather just a list of fieldname-value pairs. Getting it to produce proper CSV (one line per record) would be advantageous for letting you deal with multiple quotes with one get*, as the single-quote script you've got will be quite slow if you invoke it lots of times.

* e.g. http://jsonviewer.stack.hu/#http://query1.finance.yahoo.com/v7/finance/quote?symbols=VOD.L,BP.L,RDSB.L

Of course, the format and methods you use will depend on what you want to do with the data, and the closer you are to a "native" form the easier it is to handle the JSON. I have three implementations I use:

A JavaScript implementation built into my on-local-disk (file://) web pages that shows me the latest price/OHLC, change, time, name, etc of a few dozen stocks, indices and currencies.

A Perl implementation that runs as a cgi script on personal web servers that updates the saved price history of several hundred stocks and indices for various analyses (don't ask 8-))

A Visual Basic implementation embedded into a couple of Excel workbooks that directly updates within the workbooks the prices of a few score stocks, indices and currencies I either own or am interested in following. (Similar to HYPTUSS, but my own implementation).

Needless to say, each have their advantages and disadvantages, but each is the best fit for their required end purpose (otherwise I wouldn't have written three ;)). The JavaScript is, of course, the most native and easiest for handling the JSON, followed by the Perl, using the aforementioned JSON module. However, the one thing they do have in common is using a single get to fetch the data for all the quotes, and so are pretty zippy.

pochisoldi
Lemon Slice
Posts: 941
Joined: November 4th, 2016, 11:33 am
Has thanked: 31 times
Been thanked: 462 times

Re: How to parse Json the easy way

#118599

Postby pochisoldi » February 16th, 2018, 2:23 pm

mc2fool wrote:
schober wrote:Thanks for posting your code PochiSoldi. Perl is a foreign country for me so I haven't been able to try it out. I would be interested to know how you use the data.

Jsonview & jq both do the same thing - they make the unreadable readable. They don't convert the data to csv format or turn the timestamps into dates

I see the script as a starting point for further survey and analysis either with more bash or python or spreadsheets.

If I understand your script correctly then it doesn't produce CSV format but rather just a list of fieldname-value pairs. Getting it to produce proper CSV (one line per record) would be advantageous for letting you deal with multiple quotes with one get*, as the single-quote script you've got will be quite slow if you invoke it lots of times.


The script I posted was to show how I get the data - as posted it can handle one or more quotes in one go, hence the two "demo" calls:
getyahoo("VOD.L");
getyahoo("BP.L,RDSB.L,SBRY.L,TSCO.L");

In reality the subroutine I use doesn't print the data, it retrieves a single quote, and pushes it into a perl array in a standardized format with other data for later processing.

So my cgi script does this:
Collect share price data/market indices from Yahoo (one call per share), each call pushes data into the array
Collect personal pension fund prices from provider 1 - gets called once, does one get for all and pushes one price into the array
Collect Stakeholder pension fund prices from provider 2 - gets called once, does one get for all and pushes two prices into the array
Collect pension fund prices from provider 3 - gets called once, does one get for all and pushes three prices into the array
Collect ISA fund prices from provider 4 - gets called once, does one get for all and pushes four prices into the array
Collect currency rates from Bloomberg - one call for each of three currencies, and push the information into the array
(a total of

I then push dummy entries into the array. (bodge 1)

The array gets sorted into alphabetical order.

The sorted array gets turned into HTML - the dummy entries are stripped and replaced by blank lines.

The sorted array gets turned into a script (bodge 2) which delivers an FT.com-format CSV file.
Each item in the array has a marker to indicate whether the entry gets added to the CSV or not.

Bodge 1 was done because it was quick and dirty
Bodge 2 was done because (i) I couldn't persuade my webserver to deliver the file with the correct MIME type and (ii) the CSV file needed to be generated "on the fly" because none of the scraped data gets stored anywhere.

As far as speed is concerned, the version of my script which uses json takes less than 10 seconds to do its stuff. The previous version which scraped the prices from a website took a good 45 seconds or more.

I'm planning on tidying up the script at some point, making my "getyahoo" subroutine into a "one get for all", but that will mean using some kind of internal lookup table to add the "other data" to that retrieved via json.
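That "one get for all" plus internal lookup table could be sketched as follows (in Python rather than Perl, for brevity; the sample data, and field names such as in_csv, are invented purely for illustration):

```python
import json

# Hypothetical "other data" lookup keyed by ticker (names invented for illustration)
extra = {"VOD.L": {"name": "Vodafone", "in_csv": True},
         "BP.L":  {"name": "BP",       "in_csv": False}}

# One get for all symbols returns one result list; shape assumed from the v7 API
results = json.loads('''
[{"symbol": "VOD.L", "regularMarketPrice": 210.55},
 {"symbol": "BP.L",  "regularMarketPrice": 478.20}]
''')

merged = []
for quote in results:
    row = dict(quote)
    row.update(extra.get(quote["symbol"], {}))  # attach the static "other data"
    merged.append(row)

# A marker field controls whether an entry gets added to the CSV
csv_rows = [r for r in merged if r.get("in_csv")]
```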

PochiSoldi

mc2fool
Lemon Half
Posts: 7812
Joined: November 4th, 2016, 11:24 am
Has thanked: 7 times
Been thanked: 3017 times

Re: How to parse Json the easy way

#118603

Postby mc2fool » February 16th, 2018, 2:36 pm

pochisoldi wrote:The script I posted was to show how I get the data - as posted it can handle one or more quotes in one go, hence the two "demo" calls:
getyahoo("VOD.L");
getyahoo("BP.L,RDSB.L,SBRY.L,TSCO.L");

Yes, I know, I was referring to schober's script, not yours.

schober
Lemon Pip
Posts: 71
Joined: November 30th, 2017, 11:14 am
Has thanked: 1 time
Been thanked: 20 times

Re: How to parse Json the easy way

#118686

Postby schober » February 16th, 2018, 7:31 pm

Thanks for the replies Mc2fool & PochiSoldi; it seems you are both on a higher pay grade than me!

AFAICS "curl "query1.finance.yahoo.com/v7/finance/quote?symbols=$epic" > /ro ...."
works if $epic is a single epic or several, i.e. PRU.L,BLT.L,MSFT etc. Curl seems to do it all in one go.

If the zero in this script line "done <<< "$(jq '.quoteResponse.result[0]' /root/......"
is deleted to give "done <<< "$(jq '.quoteResponse.result[]' /root/...." then pru, blt & msft data are all displayed in the spreadsheet (with the zero only pru data is shown). The script is invoked with filename PRU.L,BLT.L,MSFT

AFAICS the final saved data is in csv format; this is a sample from the YhCsv file

language,en-US
quoteType,EQUITY
currency,GBp
exchangeDataDelayedBy,20
regularMarketChangePercent,-1.28911

The final line tells Gnumeric to display the data which appear as two columns in the spreadsheet. Gnumeric complies without a quibble!

mc2fool
Lemon Half
Posts: 7812
Joined: November 4th, 2016, 11:24 am
Has thanked: 7 times
Been thanked: 3017 times

Re: How to parse Json the easy way

#118695

Postby mc2fool » February 16th, 2018, 8:14 pm

schober wrote:... then pru, blt & msft data are all displayed in the spreadsheet

Maybe, but sequential data in a spreadsheet is pretty inconvenient and much more difficult to process/analyse than tabular data.

AFAICS the final saved data is in csv format; this is a sample from the YhCsv file

language,en-US
quoteType,EQUITY
currency,GBp
exchangeDataDelayedBy,20
regularMarketChangePercent,-1.28911

That's not CSV format, it's just a list of fieldname-value pairs. (Just 'cos each line has a comma in it doesn't make it CSV!) The essence of comma separated values is that an entire record is on one (and each) line, with all of the field values separated by commas. E.g.

symbol,language,quoteType,currency,exchangeDataDelayedBy,regularMarketChangePercent
PRU.L,en-US,EQUITY,GBp,20,-1.28911
BLT.L,en-US,EQUITY,GBp,20,9.87654
MSFT,en-US,EQUITY,USD,0,1.23456
etc

See https://en.wikipedia.org/wiki/Comma-separated_values, but definitions aside, as I say, the real point is that the format above will be much easier to do anything with than the sequential list of fieldname-value pairs you currently have.
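A sketch of producing that one-line-per-record shape with Python's csv module (the records and field list here are assumed samples, not the full Yahoo field set):

```python
import csv
import io
import json

# Hypothetical sample with two records, modelled on .quoteResponse.result
results = json.loads('''
[{"symbol": "PRU.L", "language": "en-US", "quoteType": "EQUITY",
  "currency": "GBp", "exchangeDataDelayedBy": 20, "regularMarketChangePercent": -1.28911},
 {"symbol": "MSFT", "language": "en-US", "quoteType": "EQUITY",
  "currency": "USD", "exchangeDataDelayedBy": 0, "regularMarketChangePercent": 1.23456}]
''')

fields = ["symbol", "language", "quoteType", "currency",
          "exchangeDataDelayedBy", "regularMarketChangePercent"]
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fields)
writer.writeheader()
writer.writerows(results)  # one line per record, field values separated by commas
print(buf.getvalue())
```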

0x3F
Posts: 42
Joined: November 4th, 2016, 11:12 am
Has thanked: 4 times
Been thanked: 8 times

Re: How to parse Json the easy way

#119450

Postby 0x3F » February 21st, 2018, 11:22 am

schober wrote:I see the script as a starting point for further survey and analysis either with more bash or python or spreadsheets.


My (stripped down) Python implementation uses its json library:

import requests, json

yahooJSON = requests.get("https://query1.finance.yahoo.com/v7/finance/quote?symbols=VOD.L,PRU.L,BP.L")
yahooPriceInfo = json.loads(yahooJSON.text)

pricesList = yahooPriceInfo["quoteResponse"]["result"]

# Same order as in URL, where [0] is first item in url, [1] 2nd in url etc.
vodPrice = pricesList[0]['regularMarketPrice']
pruPrice = pricesList[1]['regularMarketPrice']
bpPrice = pricesList[2]['regularMarketPrice']


It does the fetch in one go, and is easy to update to pick out any required item (opening price, share currency, etc). It's straightforward to implement a look-up to remove the hard-coded 0,1,2 indexes.
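One way to do that look-up is to key the result list by symbol, so the order of tickers in the URL no longer matters (a sketch with assumed sample data):

```python
import json

# Assumed sample of .quoteResponse.result, as returned by the v7 API
pricesList = json.loads('''
[{"symbol": "VOD.L", "regularMarketPrice": 210.55},
 {"symbol": "PRU.L", "regularMarketPrice": 1850.0},
 {"symbol": "BP.L",  "regularMarketPrice": 478.20}]
''')

# Key each result by its symbol so hard-coded indexes aren't needed
bySymbol = {q["symbol"]: q for q in pricesList}
pruPrice = bySymbol["PRU.L"]["regularMarketPrice"]
```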

I use this method in both my LibreOffice spreadsheet (price download macro) and as a standalone python script.

0x3F

PS Thanks to the original poster; I can see myself using that for quick lookups on the command line.

schober
Lemon Pip
Posts: 71
Joined: November 30th, 2017, 11:14 am
Has thanked: 1 time
Been thanked: 20 times

Re: How to parse Json the easy way

#119531

Postby schober » February 21st, 2018, 7:42 pm

Thanks for the clarification Mc2fool. I haven't decided yet how I'm going to use the data; that will determine the format which seems most suitable. I take the point that tabular data is better than linear. If more than 2 dimensions are needed then Python beckons.
Thanks 0x3F for your code.

