
ISO 4217 Currency Codes

Files: 2
Size: 156kB
Format: csv, zip
Created: 6 years ago
Updated: 3 years ago
License: Open Data Commons Public Domain Dedication and License v1.0
Source: SIX Interbank Clearing Ltd (on behalf of ISO)
List of currencies and their 3 digit codes as defined by ISO 4217. The data provided here is the consolidation of Table A.1 "Current currency & funds code list" and Table A.3 "Historic denominations". Note that the ISO page offers pay-for PDFs but also links to http://www.currency-iso.org/en/home/tables.html, which does provide them in machine readable form for free.

Data Files

Download files in this dataset

File | Description | Size | Download
codes-all | | 17kB | csv (17kB), json (63kB)
currency-codes_zip | Compressed versions of dataset. Includes normalized CSV and JSON data with original data and datapackage.json. | 25kB | zip (25kB)
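
If you prefer the bundled archive over the individual files, here is a minimal Python sketch that downloads it and lists what is inside. The r/1.zip URL is taken from the cURL examples further down and is assumed to correspond to currency-codes_zip.

import io
import urllib.request
import zipfile

# URL taken from the cURL examples below; assumed to be the currency-codes_zip resource.
ZIP_URL = 'https://datahub.io/core/currency-codes/r/1.zip'

with urllib.request.urlopen(ZIP_URL) as response:
    archive = zipfile.ZipFile(io.BytesIO(response.read()))

# The archive is described above as holding normalized CSV/JSON data,
# the original data and datapackage.json.
for name in archive.namelist():
    print(name)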

codes-all  


Field information

Field Name | Order | Type (Format) | Description
Entity | 1 | string | Country or region name
Currency | 2 | string | Name of the currency
AlphabeticCode | 3 | string | 3-letter alphabetic code for the currency
NumericCode | 4 | number | 3-digit numeric code
MinorUnit | 5 | string | Number of digits after the decimal separator (minor unit)
WithdrawalDate | 6 | string | Date the currency was withdrawn (values can be ranges or months)
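
To make the schema concrete, here is a minimal Python sketch that reads the table and splits it on WithdrawalDate. It assumes the CSV headers match the field names above and that the r/0.csv URL shown in the cURL examples below serves the codes-all file.

import csv
import io
import urllib.request

# URL taken from the cURL examples below; assumed to serve codes-all.csv.
CSV_URL = 'https://datahub.io/core/currency-codes/r/0.csv'

with urllib.request.urlopen(CSV_URL) as response:
    rows = list(csv.DictReader(io.StringIO(response.read().decode('utf-8'))))

# An empty WithdrawalDate means the currency is current (Table A.1);
# a non-empty value comes from the historic table (A.3).
current = [r for r in rows if not r['WithdrawalDate']]
withdrawn = [r for r in rows if r['WithdrawalDate']]

print(len(current), 'current entries,', len(withdrawn), 'withdrawn entries')
print(current[0]['Entity'], current[0]['Currency'], current[0]['AlphabeticCode'])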

Integrate this dataset into your favourite tool

Use our data-cli tool designed for data wranglers:

data get https://datahub.io/core/currency-codes
data info core/currency-codes
tree core/currency-codes

Alternatively, fetch the files directly with cURL:

# Get a list of dataset's resources
curl -L -s https://datahub.io/core/currency-codes/datapackage.json | grep path

# Get resources
curl -L https://datahub.io/core/currency-codes/r/0.csv
curl -L https://datahub.io/core/currency-codes/r/1.zip

If you are using R, here's how to quickly load the data you want:

install.packages("jsonlite", repos="https://cran.rstudio.com/")
library("jsonlite")

json_file <- 'https://datahub.io/core/currency-codes/datapackage.json'
json_data <- fromJSON(paste(readLines(json_file), collapse=""))

# get list of all resources:
print(json_data$resources$name)

# print all tabular data (if any exists)
for(i in 1:length(json_data$resources$datahub$type)){
  if(json_data$resources$datahub$type[i]=='derived/csv'){
    path_to_file = json_data$resources$path[i]
    data <- read.csv(url(path_to_file))
    print(data)
  }
}

Note: you might need to run the script with root permissions if you are running it on a Linux machine.

Install the Frictionless Data datapackage library and pandas itself:

pip install datapackage
pip install pandas

Now you can use the Data Package in pandas:

import datapackage
import pandas as pd

data_url = 'https://datahub.io/core/currency-codes/datapackage.json'

# to load Data Package into storage
package = datapackage.Package(data_url)

# to load only tabular data
resources = package.resources
for resource in resources:
    if resource.tabular:
        data = pd.read_csv(resource.descriptor['path'])
        print(data)
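
As a follow-up, here is a hedged sketch of how the loaded table can be queried once it is in pandas. It assumes the first tabular resource is the codes-all table and that its columns match the field information listed earlier.

import datapackage
import pandas as pd

data_url = 'https://datahub.io/core/currency-codes/datapackage.json'
package = datapackage.Package(data_url)

# Collect the tabular resources as DataFrames (same approach as above);
# the first one is assumed to be codes-all.
frames = [pd.read_csv(resource.descriptor['path'])
          for resource in package.resources
          if resource.tabular]
codes = frames[0]

# Look up a single currency by its alphabetic code.
usd = codes[codes['AlphabeticCode'] == 'USD']
print(usd[['Entity', 'Currency', 'NumericCode', 'MinorUnit']])

# Rows with an empty WithdrawalDate are current currencies (Table A.1).
print(len(codes[codes['WithdrawalDate'].isna()]), 'current entries')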

For Python, first install the `datapackage` library (all the datasets on DataHub are Data Packages):

pip install datapackage

To get the Data Package into your Python environment, run the following code:

from datapackage import Package

package = Package('https://datahub.io/core/currency-codes/datapackage.json')

# print list of all resources:
print(package.resource_names)

# print processed tabular data (if any exists)
for resource in package.resources:
    if resource.descriptor['datahub']['type'] == 'derived/csv':
        print(resource.read())
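
If you prefer rows keyed by field name rather than plain lists, the datapackage library's read method accepts a keyed flag. Here is a small sketch under that assumption, using the column names from the field information above:

from datapackage import Package

package = Package('https://datahub.io/core/currency-codes/datapackage.json')

for resource in package.resources:
    if resource.descriptor['datahub']['type'] == 'derived/csv':
        # keyed=True is assumed to return each row as a dict keyed by field name.
        for row in resource.read(keyed=True):
            if not row['WithdrawalDate']:  # empty means still in circulation
                print(row['AlphabeticCode'], '-', row['Currency'])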

If you are using JavaScript, please follow the instructions below:

Install the data.js module using npm:

  $ npm install data.js

Once the package is installed, use the following code snippet:

const {Dataset} = require('data.js')

const path = 'https://datahub.io/core/currency-codes/datapackage.json'

// We're using self-invoking function here as we want to use async-await syntax:
;(async () => {
  const dataset = await Dataset.load(path)
  // get list of all resources:
  for (const id in dataset.resources) {
    console.log(dataset.resources[id]._descriptor.name)
  }
  // get all tabular data (if any exists)
  for (const id in dataset.resources) {
    if (dataset.resources[id]._descriptor.format === "csv") {
      const file = dataset.resources[id]
      // Get a raw stream
      const stream = await file.stream()
      // entire file as a buffer (be careful with large files!)
      const buffer = await file.buffer
      // print data
      stream.pipe(process.stdout)
    }
  }
})()

Read me

List of currencies and their 3 digit codes as defined by ISO 4217. The data provided here is the consolidation of Table A.1 “Current currency & funds code list” and Table A.3 “Historic denominations”.

Note that the ISO page offers pay-for PDFs but also links to http://www.currency-iso.org/en/home/tables.html, which does provide them in machine readable form for free.

Data

The data in this data package (see data/codes-all.csv) is a consolidated list of currency (and funds) codes, produced by combining the two tables named above: Table A.1 "Current currency & funds code list" and Table A.3 "Historic denominations".

Preparation

The script requires the recode package to be installed. Install it by running:

sudo apt install recode

Run the following script to download and convert the data from XML to CSV:

cd scripts/
./runall.sh

The raw XML files are stored in ./archive. The cleaned data are in ./data/codes-all.csv.
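
For orientation, here is a minimal Python sketch of the kind of XML-to-CSV conversion the script performs. The element names (CcyTbl, CcyNtry, CtryNm, CcyNm, Ccy, CcyNbr, CcyMnrUnts) are assumptions based on the published ISO list_one.xml, and the output path is hypothetical; the actual runall.sh pipeline, including the recode step, may differ.

import csv
import xml.etree.ElementTree as ET

# Assumed input path; the downloaded XML files live in ./archive.
tree = ET.parse('archive/list_one.xml')

# Hypothetical output path; the real script writes ./data/codes-all.csv.
with open('data/current-codes.csv', 'w', newline='') as out:
    writer = csv.writer(out)
    writer.writerow(['Entity', 'Currency', 'AlphabeticCode',
                     'NumericCode', 'MinorUnit', 'WithdrawalDate'])
    # Element names are assumptions; adjust them if the ISO schema differs.
    for entry in tree.iter('CcyNtry'):
        writer.writerow([
            entry.findtext('CtryNm', '').strip(),
            entry.findtext('CcyNm', '').strip(),
            entry.findtext('Ccy', ''),
            entry.findtext('CcyNbr', ''),
            entry.findtext('CcyMnrUnts', ''),
            '',  # the current list (A.1) has no withdrawal date
        ])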

Version

The current tables have a published date of 28 March 2014 (as indicated in the XML files).

License

Placed in the Public Domain under the Public Domain Dedication and License (PDDL). The original site states no restriction on use, and the data is small and completely factual.


Keywords and keyphrases: iso 4217, iso currency codes, iso 4217 currency code, iso4217, iso 4217 currency codes, iso-4217.
