Milestone #1: Data Investigation

My first step is to investigate my data options for this project. As discussed in my plan, I am considering Google Streetview data and LiDAR data. The Streetview data is my first choice, but I realized it might be different from what I expect, or have complications that make it difficult or impossible to do what I have in mind. I wanted to consider alternatives, and there's a lot that interests me about LiDAR data. Of course, that data might be impossible to work with too. In any case, I needed to find these things out right away, while it is still easy to change course on this project.

The summary for Google Streetview data is that it is easy to work with and close to what I expected. Google provides a convenient, properly documented API. Unfortunately, the depth data discussed in this blog post does not come from the API, and that information is compressed in a format I have not yet parsed. The author of that post does provide C++ code for doing so; I am optimistic that I will be able to translate it to Python and/or integrate their process into my code.

LiDAR data is also well documented but extremely complex. I've worked with complex data before and am confident I could manage it if I put in the time. My objection is that taking the project in that direction would eat up a good portion of the class, leaving me less time to learn about the topics I want to be learning about.

Additionally, the challenges I would face with the Google Streetview data are resonating with me in a way that the LiDAR data challenges are not.

My conclusion is that I will use the Google Streetview data for this project. Sometime after the semester is over I might spend more time with the LiDAR data and get some experience working with it. It would be a great choice for a future project.

Working with Google Streetview data

Using the Google Streetview API is easy in Python.

First, import a few packages.

In [1]:
import requests
from IPython.display import Image

The API documentation says that requests should include an API key. The free plan allows 25K requests per day with a maximum image resolution of 640x640.

In [2]:
with open('../secrets/google_street_view_image_api.txt.nogit', 'r') as f:
    api_key = f.read().strip()  # strip any trailing newline

This is the basic URL format I can use for image metadata:

In [3]:
API_METADATA_URL = ("https://maps.googleapis.com/maps/api/streetview/metadata?"
                    "location={0},{1}&key={2}")

I can pick a location using latitude and longitude. Using my API key, I can construct the URL.

In [4]:
location = (40.7293934, -73.9934632)

url = API_METADATA_URL.format(location[0], location[1], api_key)

With the Python requests library I can easily request the URL and parse the response.

In [5]:
response = requests.get(url)

response.json()
Out[5]:
{'copyright': '© Google, Inc.',
 'date': '2017-09',
 'location': {'lat': 40.72939870242668, 'lng': -73.99343724185765},
 'pano_id': '4ElQmDQdNK49Kc_Mxjp1Tw',
 'status': 'OK'}

The OK status tells me that if I make a request for Streetview imagery at this location, I will get a valid response. The other information is also useful, but I won't get into that here.
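As a small sketch (the helper name and structure are my own, not part of the API), I could wrap this metadata check so that I only request imagery for locations that actually have panoramas:

def has_panorama(location, api_key):
    # Hypothetical convenience wrapper around the metadata endpoint
    # used above; returns True only when Streetview reports imagery.
    url = API_METADATA_URL.format(location[0], location[1], api_key)
    return requests.get(url).json()['status'] == 'OK'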

Here is the URL for image data:

In [6]:
API_URL = ("https://maps.googleapis.com/maps/api/streetview?"
           "location={0},{1}&size=640x640&heading={2}&fov=90&pitch={3}&key={4}")

I need to specify a viewing direction with heading and pitch:

In [7]:
heading = 354
pitch = 0

I can again construct the URL and fetch the data.

In [8]:
url = API_URL.format(location[0], location[1], heading, pitch, api_key)

response = requests.get(url)

Image(response.content)
Out[8]:
[Image: The front of 721 Broadway, a city building with a purple NYU flag flying over the front door. There is an apartment building in the background and cars on the street.]

This is the current home of ITP.

I can change the heading to look in another direction.

In [9]:
heading -= 180

url = API_URL.format(location[0], location[1], heading, pitch, api_key)

response = requests.get(url)

Image(response.content)
Out[9]:
[Image: A city scene from the middle of the street, with cars on the road, a sidewalk and buildings next to the road, and a McDonald's restaurant.]

If I change the pitch I can look up or down.

In [10]:
pitch = 90

url = API_URL.format(location[0], location[1], heading, pitch, api_key)

response = requests.get(url)

Image(response.content)
Out[10]:
[Image: View looking up from the center of a city street. There are clouds and the buildings are reaching towards the sky.]

Using a FOV (field of view) of 90 degrees I can make 6 queries to fetch pictures for 6 faces of an imaginary cube. These 6 pictures can be assembled into the equirectangular projection format used in 360 video.
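Here is a minimal sketch of what those 6 queries might look like. The specific heading/pitch pairs (four sides at pitch 0, then straight up and straight down) are my assumption about how to cover the cube; I haven't run this against the assembly step yet.

def fetch_cube_faces(location, api_key, base_heading=0):
    # Four side faces at pitch 0, plus one face up and one face down.
    directions = [((base_heading + offset) % 360, 0)
                  for offset in (0, 90, 180, 270)]
    directions += [(base_heading, 90), (base_heading, -90)]

    faces = []
    for heading, pitch in directions:
        url = API_URL.format(location[0], location[1],
                             heading, pitch, api_key)
        faces.append(requests.get(url).content)
    return faces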

I have experience working with equirectangular projections from my efforts extending Camera3D to support 360 videos. From deriving the equations for optimal sizing I know that 640x640 cube faces will get me 360 video resolution of about 2K. That would be a little bit pixelated in a 360 video player but it should be more than enough for my purposes here.
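As a back-of-the-envelope check on that claim (my arithmetic here, not the full Camera3D derivation): a 640-pixel face spanning a 90 degree FOV implies a focal length of 320 pixels, and matching that angular resolution at the equator of an equirectangular frame gives a width of 2π · 320 ≈ 2011 pixels, i.e. roughly 2K.

import math

face_size = 640                                        # pixels across a 90 degree FOV
focal = (face_size / 2) / math.tan(math.radians(45))  # 320 pixels
equirect_width = 2 * math.pi * focal                   # match resolution at the equator

round(equirect_width)  # ~2011 pixels wide, i.e. roughly 2K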

Depth data

What about the depth data? Other projects similar to mine have made use of depth data. That doesn't come from the API; it comes from specific URL requests to Google's servers.

In [11]:
url = "http://maps.google.com/cbk?output=json&ll={0},{1}&dm=1".format(*location)

response = requests.get(url)

response.json()
Out[11]:
{'Data': {'copyright': '© 2018 Google',
  'image_date': '2017-09',
  'image_height': '8192',
  'image_width': '16384',
  'imagery_type': 1,
  'tile_height': '512',
  'tile_width': '512'},
 'Links': [{'description': 'Broadway',
   'panoId': 'RMvRGNLRu7QOyXIb6i4Aqw',
   'road_argb': '0x80fdf872',
   'yawDeg': '31.23'},
  {'description': 'Broadway',
   'panoId': 'h43_C8RtuVS-eyaBTeDuLw',
   'road_argb': '0x80fdf872',
   'yawDeg': '211.38'}],
 'Location': {'best_view_direction_deg': '254.966',
  'country': 'United States',
  'description': '725 Broadway',
  'elevation_egm96_m': '17.854675',
  'elevation_wgs84_m': '-14.852387',
  'lat': '40.729399',
  'lng': '-73.993437',
  'original_lat': '40.729393',
  'original_lng': '-73.993463',
  'panoId': '4ElQmDQdNK49Kc_Mxjp1Tw',
  'region': 'New York',
  'streetRange': '725',
  'zoomLevels': '5'},
 'Projection': {'pano_yaw_deg': '211.21',
  'projection_type': 'spherical',
  'tilt_pitch_deg': '0.24',
  'tilt_yaw_deg': '-123.35'},
 'model': {'depth_map': 'eJzt1X10U-UBx_G0tV0tbVqw2NKuAkJLoVVEQLBJvDeN-AYIMgQVN6eAZ9MpsIlHy-hFOhCFDXQycAIOUXAHpiJIR3LDI1bqOIjsUFnBIUWqwBQRChyr... (several kilobytes of base64 data truncated) ...'}}

The 'depth_map' at the end of the JSON response is compressed data that has been encoded in base64. I can decode the base64, but I have not yet managed to decompress the result with Python's zlib library. My next step is to experiment with the C++ code I found, figure out how it works, and convert it to Python.
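For reference, here is my guess at what the first decoding steps will look like. This is unverified: the URL-safe base64 alphabet and the use of zlib are assumptions, based on the '-' and '_' characters in the blob and on its 'eJz' prefix, which decodes to the standard 0x78 0x9C zlib header bytes.

import base64
import zlib

depth_map = response.json()['model']['depth_map']

# Pad to a multiple of 4 characters, then decode as URL-safe base64.
padded = depth_map + '=' * ((4 - len(depth_map) % 4) % 4)
compressed = base64.urlsafe_b64decode(padded)

# If the 0x78 0x9C prefix really is a zlib header, this should
# yield the raw depth map bytes the C++ code parses.
raw = zlib.decompress(compressed)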

More importantly, I need to write Python code to systematically access this data and keep it organized. I've got some ideas for how to do that and will start putting that together tomorrow during our class.
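One possible shape for that code, purely as a sketch (the function, the directory layout, and the reuse of the fetch_cube_faces helper from above are all placeholders for whatever I actually end up building): use each panorama's pano_id as a stable key and store the cube faces under it.

import os

def save_panorama(location, api_key, out_dir='data'):
    url = API_METADATA_URL.format(location[0], location[1], api_key)
    metadata = requests.get(url).json()
    if metadata['status'] != 'OK':
        return None

    # Key the files by pano_id so repeated queries near the same
    # spot don't create duplicate entries on disk.
    pano_dir = os.path.join(out_dir, metadata['pano_id'])
    os.makedirs(pano_dir, exist_ok=True)
    for i, face in enumerate(fetch_cube_faces(location, api_key)):
        with open(os.path.join(pano_dir, 'face_{0}.jpg'.format(i)), 'wb') as f:
            f.write(face)
    return pano_dir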
