Downloads

petropy.ul_lands_download(save_dir=None)

Downloads las files from University Lands Texas

This function downloads files from the University Lands ftp site located at publicftp.utlands.utsystem.edu. It inventories readable logs into a csv file of header data saved in save_dir, and returns the same inventory as a DataFrame.

The bulk of this script is provided courtesy of Jon Reynolds, with Glacier Geosciences at:

http://www.glaciergeosciences.com/

Parameters: save_dir (str, default None) – Path to the directory to save data. Defaults to the data folder within petropy.
Returns: DataFrame of header data for all logs downloaded and read.
Return type: DataFrame

Examples

>>> import petropy as ptr
>>> ptr.ul_lands_download()
>>> import petropy as ptr
>>> p = r'path/to/my/folder/'
>>> ptr.ul_lands_download(p)
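
The returned DataFrame mirrors the inventory csv written to save_dir, so the header data can be inspected directly once the download finishes. A minimal sketch (the exact inventory columns depend on the headers present in the downloaded logs):

>>> import petropy as ptr
>>> p = r'path/to/my/folder/'
>>> df = ptr.ul_lands_download(p)
>>> # preview the inventory of header data
>>> print(df.head())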

Note

This function takes approximately twelve hours to scan the ftp site, download, and inventory roughly 30 GB of log data. Actual time will vary with internet and processor speed.

petropy.kgs_download(save_dir=None)

Downloads las files from the Kansas Geological Survey

This function downloads files from the Kansas Geological Survey. These are zip files nested inside zip files, so the function extracts all las files and saves them either in the input folder save_dir or, by default, with the package data in the data/kgs folder. It inventories readable logs into a csv file of header data saved in save_dir, and returns the same inventory as a DataFrame.

Parameters: save_dir (str, default None) – Path to the directory to save data. Defaults to the data folder within petropy.
Returns: DataFrame of header data for all logs downloaded and read.
Return type: DataFrame

Examples

>>> import petropy as ptr
>>> ptr.kgs_download()
>>> import petropy as ptr
>>> p = r'path/to/my/folder/'
>>> ptr.kgs_download(p)
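
Since the returned DataFrame is the same inventory written to the csv file, it can be filtered right away without re-reading the file. A minimal sketch, assuming the curve-flag columns used in the create_log_inventory_table example below (e.g. GR_N):

>>> import petropy as ptr
>>> p = r'path/to/my/folder/'
>>> df = ptr.kgs_download(p)
>>> # count downloaded logs that contain a gamma ray curve
>>> print(len(df[df.GR_N == 'Y']))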

Note

This function takes approximately one hour to download, unzip, and inventory roughly 20 GB of log data. Actual time will vary with internet and processor speed.

petropy.create_log_inventory_table(save_dir, folder_copy=None)

Recursively scans all folders and subfolders for las files and opens each as a petropy.Log object. Extracts header data and curve names, saves the inventory to a csv file in the save_dir folder, and returns it as a DataFrame.

Parameters: save_dir (str) – Path to the folder for the recursive scan.
Returns: DataFrame of header data for all logs found and read.
Return type: DataFrame

Example

>>> import petropy as ptr
>>> p = r'path/to/folder/'
>>> df = ptr.create_log_inventory_table(p)
>>> # filter logs with triple-combo for processing
>>> tc_df = df[(df.GR_N == 'Y') & (df.RESDEEP_N == 'Y') &
...            (df.NPHI_N == 'Y') & (df.RHOB_N == 'Y')]
>>> # print count of usable logs
>>> print(len(tc_df))
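
The filtered inventory can then be used to open the matching las files as petropy.Log objects for processing. A minimal sketch, assuming the inventory stores each file's location in a column named PATH (a hypothetical column name; check the csv written to save_dir for the actual one):

>>> import petropy as ptr
>>> p = r'path/to/folder/'
>>> df = ptr.create_log_inventory_table(p)
>>> tc_df = df[(df.GR_N == 'Y') & (df.RESDEEP_N == 'Y') &
...            (df.NPHI_N == 'Y') & (df.RHOB_N == 'Y')]
>>> # open each triple-combo log; PATH is a hypothetical column name
>>> logs = [ptr.Log(f) for f in tc_df.PATH]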