<aside> 🚀 [Mini-Course Alert] If you're new to Cloud Functions or just need a refresher, we've designed a free mini-course that covers all the essentials. [TIP] The course can also serve as a reference when dealing with Cloud Storage.
</aside>
Having developed the Python code to extract, transform, and store data in previous chapters, our next step is to prepare this code for deployment to Google Cloud Functions, enabling automatic execution in a production environment (GCP).
In the latest version of our main.py file, we had the following code below the upload_df_to_bigquery function:
File: main.py
api_key = 'USE YOUR API KEY'

# Create a dictionary with the coordinates of 5 locations:
locations_dict = {
    'Thessaloniki, GR': {'lat': '40.6403', 'lon': '22.9439'},
    'Paris, FR': {'lat': '48.85341', 'lon': '2.3488'},
    'London, GB': {'lat': '51.50853', 'lon': '-0.12574'},
    'Dubai, AE': {'lat': '25.276987', 'lon': '55.296249'},
    'Los Angeles, US': {'lat': '34.0522', 'lon': '-118.2437'},
}

# Extract the raw weather data from the API:
weather_data = get_weather_data(locations_dict, api_key)

# Transform the current weather data into a single DataFrame:
current_weather = pd.DataFrame()
for key, value in weather_data['current'].items():
    current_weather = pd.concat([current_weather, transform_current_weather_data(value)])

# Transform the forecasted weather data into a single DataFrame:
forecast_weather = pd.DataFrame()
for key, value in weather_data['forecast'].items():
    forecast_weather = pd.concat([forecast_weather, transform_forecasted_weather_data(value)])

# Store the raw current weather data in Cloud Storage:
bucket_name = "raw_weather_api_data"  # REPLACE IT WITH YOUR GLOBALLY UNIQUE BUCKET NAME
for key, value in weather_data['current'].items():
    folder_path_current = f'current_weather/{key}'
    upload_json_to_gcs(value, bucket_name, folder_path_current)

# Store the raw forecasted weather data in Cloud Storage:
for key, value in weather_data['forecast'].items():
    folder_path_forecast = f'forecasted_weather/{key}'
    upload_json_to_gcs(value, bucket_name, folder_path_forecast)

# Load the transformed DataFrames into BigQuery:
upload_df_to_bigquery(dataframe=current_weather, project_id='YOUR_GCP_PROJECT_ID', dataset_id='weather_api', table_name='current_weather')
upload_df_to_bigquery(dataframe=forecast_weather, project_id='YOUR_GCP_PROJECT_ID', dataset_id='weather_api', table_name='forecasted_weather')
Put these commands in a main() function:
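As a minimal sketch, the wrapper looks like this (request is the argument Google Cloud Functions passes to an HTTP-triggered function; the entry point name, main here, is the one you configure at deployment). The body is filled in with the commands above and refined in the next steps:
def main(request):
    # The extract, transform, and load commands from the listing above
    # become the body of this function.
    api_key = 'USE YOUR API KEY'
    # ... rest of the commands ...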
Start the Function by Handling Incoming Data:
Add the following lines at the start of the main() function:
try:
    request_body = request.get_json()
except Exception:
    # When testing locally, the function receives a JSON string instead of a request object:
    request_body = json.loads(request)
What This Code Does: It tries to read the data sent to your function. When the function runs on Google Cloud, the incoming request object's get_json() method works. When you test locally and pass a plain JSON string instead, get_json() fails and the except block parses the string with json.loads() (make sure json is imported at the top of main.py).
Why We Do This: It lets the same function handle input both while testing on your computer and when running on Google Cloud.
End the Function by Sending a Success Message:
What to Add: At the end of your function, add this line:
return '200, Success'
What It Does: This sends a message back saying everything went well.
Why It's Important: It’s like giving a thumbs-up to whatever called your function, letting them know everything worked as expected.
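Putting it all together, main() now looks roughly like this. This is a sketch that assumes the helper functions (get_weather_data, transform_current_weather_data, transform_forecasted_weather_data, upload_json_to_gcs, upload_df_to_bigquery) and the imports (json, pandas as pd) are already defined at the top of main.py, as in the previous chapters:
def main(request):
    # Read the incoming request body (works both on Cloud Functions and locally):
    try:
        request_body = request.get_json()
    except Exception:
        request_body = json.loads(request)

    api_key = 'USE YOUR API KEY'

    # Create a dictionary with the coordinates of 5 locations:
    locations_dict = {
        'Thessaloniki, GR': {'lat': '40.6403', 'lon': '22.9439'},
        'Paris, FR': {'lat': '48.85341', 'lon': '2.3488'},
        'London, GB': {'lat': '51.50853', 'lon': '-0.12574'},
        'Dubai, AE': {'lat': '25.276987', 'lon': '55.296249'},
        'Los Angeles, US': {'lat': '34.0522', 'lon': '-118.2437'},
    }

    # Extract the raw weather data from the API:
    weather_data = get_weather_data(locations_dict, api_key)

    # Transform the current and forecasted data into DataFrames:
    current_weather = pd.DataFrame()
    for key, value in weather_data['current'].items():
        current_weather = pd.concat([current_weather, transform_current_weather_data(value)])

    forecast_weather = pd.DataFrame()
    for key, value in weather_data['forecast'].items():
        forecast_weather = pd.concat([forecast_weather, transform_forecasted_weather_data(value)])

    # Store the raw JSON responses in Cloud Storage:
    bucket_name = "raw_weather_api_data"  # REPLACE IT WITH YOUR GLOBALLY UNIQUE BUCKET NAME
    for key, value in weather_data['current'].items():
        upload_json_to_gcs(value, bucket_name, f'current_weather/{key}')

    for key, value in weather_data['forecast'].items():
        upload_json_to_gcs(value, bucket_name, f'forecasted_weather/{key}')

    # Load the transformed DataFrames into BigQuery:
    upload_df_to_bigquery(dataframe=current_weather, project_id='YOUR_GCP_PROJECT_ID', dataset_id='weather_api', table_name='current_weather')
    upload_df_to_bigquery(dataframe=forecast_weather, project_id='YOUR_GCP_PROJECT_ID', dataset_id='weather_api', table_name='forecasted_weather')

    # Let the caller know everything worked:
    return '200, Success'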
Setup for Local Testing:
File: main.py, below the main() function:
if __name__ == "__main__":
    data = {}  # This is used as the request body
    payload = json.dumps(data)
    print(main(payload))
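With this block in place, you can test the pipeline on your computer by running python main.py. The empty dictionary stands in for the request body that Cloud Functions would normally send, json.dumps() turns it into a JSON string, and main() falls back to the json.loads() branch because a plain string has no get_json() method. Once deployed, Google Cloud Functions calls main() with a real request object and that fallback is never needed.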