kevinselhi

Location: Berkeley, CA

kevinselhi's repositories

ai-platform-samples

Official Repo for Google Cloud AI Platform

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 0

annoy

Approximate Nearest Neighbors in C++/Python optimized for memory usage and loading/saving to disk

Language: C++ · License: Apache-2.0 · Stargazers: 0 · Issues: 0
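
A minimal sketch of how Annoy is typically used; the vector dimensionality, item count, and tree count below are illustrative values, not anything taken from this repo:

```python
import random
from annoy import AnnoyIndex

dim = 40  # dimensionality of the vectors (illustrative)
index = AnnoyIndex(dim, "angular")  # angular distance ~ cosine similarity

# Add a few random vectors; in practice these would be real embeddings.
for i in range(1000):
    index.add_item(i, [random.gauss(0, 1) for _ in range(dim)])

index.build(10)          # build 10 trees; more trees -> better accuracy, bigger index
index.save("test.ann")   # the saved index can be memory-mapped from disk later

# Query the 10 approximate nearest neighbours of item 0.
print(index.get_nns_by_item(0, 10))
```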

atari-representation-learning

Code for "Unsupervised State Representation Learning in Atari"

Language: Jupyter Notebook · License: MIT · Stargazers: 0 · Issues: 1

babyai

BabyAI platform. A testbed for training agents to understand and execute language commands.

Language: Python · License: BSD-3-Clause · Stargazers: 0 · Issues: 0

bento

Everything you need to know about web development. Neatly packaged.

License: MIT · Stargazers: 0 · Issues: 2

botorch

Bayesian optimization in PyTorch

Language: Python · License: MIT · Stargazers: 0 · Issues: 1
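
A minimal Bayesian-optimization sketch in the spirit of this library, assuming a recent BoTorch/GPyTorch release (older versions use `fit_gpytorch_model` instead of `fit_gpytorch_mll`); the toy objective is made up for illustration:

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy data: maximize a simple 2-D objective (illustrative only).
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = -((train_X - 0.5) ** 2).sum(dim=-1, keepdim=True)

# Fit a GP surrogate to the observations.
gp = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_mll(mll)

# Optimize Expected Improvement to propose the next point to evaluate.
ei = ExpectedImprovement(gp, best_f=train_Y.max())
bounds = torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double)
candidate, acq_value = optimize_acqf(ei, bounds=bounds, q=1, num_restarts=5, raw_samples=32)
print(candidate)
```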

faiss

A library for efficient similarity search and clustering of dense vectors.

Language: C++ · License: MIT · Stargazers: 0 · Issues: 0
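
A small sketch of the Python bindings using the exact flat index; the dimensionality and random data are placeholders:

```python
import numpy as np
import faiss

d = 64                                                 # vector dimensionality (illustrative)
xb = np.random.random((10000, d)).astype("float32")    # database vectors
xq = np.random.random((5, d)).astype("float32")        # query vectors

index = faiss.IndexFlatL2(d)     # exact L2 search; approximate index types exist too
index.add(xb)                    # add database vectors
D, I = index.search(xq, 4)       # distances and ids of the 4 nearest neighbours
print(I)
```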

fastai

The fastai deep learning library, plus lessons and tutorials

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 1

fbmessenger

A Python library for communicating with the Facebook Messenger APIs

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 0

getting-started-python

Code samples for using Python on Google Cloud Platform

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1

google-cloud-python

Google Cloud Client Library for Python

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1
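
A small sketch using one of the client libraries this repo collects (Cloud Storage here); the bucket and object names are placeholders, and credentials are assumed to come from the environment:

```python
from google.cloud import storage

# Credentials come from GOOGLE_APPLICATION_CREDENTIALS / `gcloud auth`.
client = storage.Client()

bucket = client.bucket("my-example-bucket")     # placeholder bucket name
blob = bucket.blob("reports/hello.txt")         # placeholder object path
blob.upload_from_string("hello from google-cloud-python")

# List the first few objects in the bucket.
for b in client.list_blobs("my-example-bucket", max_results=5):
    print(b.name)
```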

google-maps-services-python

Python client library for Google Maps API Web Services

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1
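
Typical usage looks roughly like this; the API key and addresses are placeholders, and the key must have the Geocoding and Directions APIs enabled:

```python
import googlemaps

gmaps = googlemaps.Client(key="YOUR_API_KEY")  # placeholder key

# Geocode an address to latitude/longitude.
geocode_result = gmaps.geocode("1600 Amphitheatre Parkway, Mountain View, CA")
print(geocode_result[0]["geometry"]["location"])

# Request driving directions between two places.
directions = gmaps.directions("Berkeley, CA", "San Francisco, CA", mode="driving")
print(directions[0]["legs"][0]["duration"]["text"])
```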

Hangman

A simple hangman game made with Python and Pygame.

Stargazers: 0 · Issues: 0

ivado-mila-dl-school-2019

IVADO/Mila's Summer Deep Learning School

Stargazers: 0 · Issues: 0

JupyterBrew

Demo of ArcGIS Python API and displaying Yelp reviews

Language: Jupyter Notebook · License: GPL-3.0 · Stargazers: 0 · Issues: 1

Lasers_and_Feelings_Random_Generator

A Lasers and Feelings random generator (heroes, ship, and threats) built in Python

Language: Python · Stargazers: 0 · Issues: 1

mat2vec

Supplementary Materials for Tshitoyan et al. "Unsupervised word embeddings capture latent knowledge from materials science literature", Nature (2019).

Language: Python · License: MIT · Stargazers: 0 · Issues: 1

mit-deep-learning-book-pdf

MIT Deep Learning Book in PDF format (complete and parts) by Ian Goodfellow, Yoshua Bengio and Aaron Courville

Language: Java · Stargazers: 0 · Issues: 1

neural-networks-and-deep-learning

Code samples for my book "Neural Networks and Deep Learning"

Language: Python · Stargazers: 0 · Issues: 1
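
The book's companion code is usually driven roughly as below; module and class names follow the upstream `mnist_loader.py`/`network.py`, but treat the exact calls as an approximation of that repo's interface (the upstream code targets Python 2.7; community ports exist for Python 3):

```python
# Assumes the repo's src/ directory (with mnist_loader.py and network.py) is on sys.path.
import mnist_loader
import network

training_data, validation_data, test_data = mnist_loader.load_data_wrapper()

# A 784-30-10 fully connected network trained with mini-batch SGD:
# 30 epochs, mini-batch size 10, learning rate 3.0.
net = network.Network([784, 30, 10])
net.SGD(training_data, 30, 10, 3.0, test_data=test_data)
```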

Probabilistic-Programming-and-Bayesian-Methods-for-Hackers

aka "Bayesian Methods for Hackers": An introduction to Bayesian methods + probabilistic programming with a computation/understanding-first, mathematics-second point of view. All in pure Python ;)

Language: Python · License: MIT · Stargazers: 0 · Issues: 1
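
A tiny example in the spirit of the book, written against PyMC3 (the repo also maintains branches for PyMC2 and TensorFlow Probability); the coin-flip data is synthetic:

```python
import numpy as np
import pymc3 as pm

# Synthetic coin flips with an unknown bias.
data = np.random.binomial(1, 0.7, size=200)

with pm.Model():
    p = pm.Uniform("p", lower=0.0, upper=1.0)      # prior over the bias
    pm.Bernoulli("obs", p=p, observed=data)        # likelihood
    trace = pm.sample(2000, tune=1000, chains=2)   # posterior samples via NUTS

print(pm.summary(trace))
```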

PyMessager

Python API for developing chatbots on the Facebook Messenger Platform

Language: Python · License: MIT · Stargazers: 0 · Issues: 1

python-api-challenge

# Python API Homework - What's the Weather Like?

## Background

Whether financial, political, or social -- data's true power lies in its ability to answer questions definitively. So let's take what you've learned about Python requests, APIs, and JSON traversals to answer a fundamental question: "What's the weather like as we approach the equator?"

Now, we know what you may be thinking: _"Duh. It gets hotter..."_ But, if pressed, how would you **prove** it?

![Equator](Images/equatorsign.png)

### Before You Begin

1. Create a new repository for this project called `python-api-challenge`. **Do not add this homework to an existing repository**.
2. Clone the new repository to your computer.
3. Inside your local git repository, create a directory for both of the Python Challenges. Use folder names corresponding to the challenges: **WeatherPy**.
4. Inside the folder that you just created, add new files called `WeatherPy.ipynb` and `VacationPy.ipynb`. These will be the main scripts to run for each analysis.
5. Push the above changes to GitHub.

## Part I - WeatherPy

In this example, you'll be creating a Python script to visualize the weather of 500+ cities across the world at varying distances from the equator. To accomplish this, you'll be utilizing a [simple Python library](https://pypi.python.org/pypi/citipy), the [OpenWeatherMap API](https://openweathermap.org/api), and a little common sense to create a representative model of weather across world cities.

Your first objective is to build a series of scatter plots to showcase the following relationships:

* Temperature (F) vs. Latitude
* Humidity (%) vs. Latitude
* Cloudiness (%) vs. Latitude
* Wind Speed (mph) vs. Latitude

After each plot, add a sentence or two explaining what the code is doing and analyzing the result.

Your next objective is to run linear regression on each relationship, only this time separating them into Northern Hemisphere (greater than or equal to 0 degrees latitude) and Southern Hemisphere (less than 0 degrees latitude):

* Northern Hemisphere - Temperature (F) vs. Latitude
* Southern Hemisphere - Temperature (F) vs. Latitude
* Northern Hemisphere - Humidity (%) vs. Latitude
* Southern Hemisphere - Humidity (%) vs. Latitude
* Northern Hemisphere - Cloudiness (%) vs. Latitude
* Southern Hemisphere - Cloudiness (%) vs. Latitude
* Northern Hemisphere - Wind Speed (mph) vs. Latitude
* Southern Hemisphere - Wind Speed (mph) vs. Latitude

After each pair of plots, explain what the linear regression is modelling, such as any relationships you notice and any other analysis you may have.

Your final notebook must:

* Randomly select **at least** 500 unique (non-repeat) cities based on latitude and longitude.
* Perform a weather check on each of the cities using a series of successive API calls.
* Include a print log of each city as it's being processed, with the city number and city name.
* Save a CSV of all retrieved data and a PNG image for each scatter plot.

### Part II - VacationPy

Now let's use your skills in working with weather data to plan future vacations. Use jupyter-gmaps and the Google Places API for this part of the assignment.

* **Note:** if you're having trouble displaying the maps, try running `jupyter nbextension enable --py gmaps` in your environment and retry.
* Create a heat map that displays the humidity for every city from Part I of the homework.

  ![heatmap](Images/heatmap.png)

* Narrow down the DataFrame to find your ideal weather condition. For example:
  * A max temperature lower than 80 degrees but higher than 70.
  * Wind speed less than 10 mph.
  * Zero cloudiness.
  * Drop any rows that don't satisfy all three conditions. You want to be sure the weather is ideal.
  * **Note:** Feel free to adjust to your specifications, but be sure to limit the number of rows returned by your API requests to a reasonable number.
* Use the Google Places API to find the first hotel for each city located within 5000 meters of your coordinates.
* Plot the hotels on top of the humidity heatmap, with each pin containing the **Hotel Name**, **City**, and **Country**.

  ![hotel map](Images/hotel_map.png)

As final considerations:

* Create a new GitHub repository for this project called `API-Challenge` (note the kebab-case). **Do not add to an existing repo.**
* You must complete your analysis using a Jupyter notebook.
* You must use the Matplotlib or Pandas plotting libraries.
* For Part I, you must include a written description of three observable trends based on the data.
* You must use proper labeling of your plots, including aspects like plot titles (with the date of analysis) and axis labels.
* For max intensity in the heat map, try setting it to the highest humidity found in the data set.

## Hints and Considerations

* The city data you generate is based on random coordinates as well as different query times; as such, your outputs will not be an exact match to the provided starter notebook.
* You may want to start this assignment by refreshing yourself on the [geographic coordinate system](http://desktop.arcgis.com/en/arcmap/10.3/guide-books/map-projections/about-geographic-coordinate-systems.htm).
* Next, spend the requisite time necessary to study the OpenWeatherMap API. Based on your initial study, you should be able to answer basic questions about the API: Where do you request the API key? Which Weather API in particular will you need? What URL endpoints does it expect? What JSON structure does it respond with? Before you write a line of code, you should be aiming to have a crystal clear understanding of your intended outcome.
* Starter code for Citipy has been provided. However, if you're craving an extra challenge, push yourself to learn how it works: [citipy Python library](https://pypi.python.org/pypi/citipy). Before you try to incorporate the library into your analysis, start by creating simple test cases outside your main script to confirm that you are using it correctly. Too often, when introduced to a new library, students get bogged down by the most minor of errors -- spending hours investigating their entire code -- when, in fact, a simple and focused test would have shown their basic utilization of the library was wrong from the start. Don't let this be you!
* Part of our expectation in this challenge is that you will use critical thinking skills to understand how and why we're recommending the tools we are. What is Citipy for? Why would you use it in conjunction with the OpenWeatherMap API? How would you do so?
* In building your script, pay attention to the cities you are using in your query pool. Are you getting coverage of the full gamut of latitudes and longitudes? Or are you simply choosing 500 cities concentrated in one region of the world? Even if you were a geographic genius, simply rattling off 500 cities based on your human selection would create a biased dataset. Be thinking of how you should counter this. (Hint: Consider the full range of latitudes.)
* Once you have computed the linear regression for one chart, the process will be similar for all others. As a bonus, try to create a function that will create these charts based on different parameters.
* Remember that each coordinate will trigger a separate call to the Google API. If you're creating your own criteria to plan your vacation, try to reduce the results in your DataFrame to 10 or fewer cities.
* Lastly, remember -- this is a challenging activity. Push yourself! If you complete this task, then you can safely say that you've gained a strong mastery of the core foundations of data analytics, and it will only get better from here. Good luck!

### Copyright

Trilogy Education Services © 2019. All Rights Reserved.

Language: Jupyter Notebook · Stargazers: 0 · Issues: 1
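
A minimal sketch of the Part I workflow described above (random coordinates → citipy → OpenWeatherMap → per-hemisphere regression); the API key is a placeholder, only two weather fields are collected, and error handling is kept to the bare minimum:

```python
import numpy as np
import pandas as pd
import requests
import matplotlib.pyplot as plt
from citipy import citipy
from scipy.stats import linregress

API_KEY = "YOUR_OPENWEATHERMAP_KEY"  # placeholder
URL = "https://api.openweathermap.org/data/2.5/weather"

# Sample random coordinates across the full latitude/longitude range,
# then map each pair to its nearest city with citipy (deduplicated via a set).
lats = np.random.uniform(-90, 90, size=1500)
lngs = np.random.uniform(-180, 180, size=1500)
cities = {citipy.nearest_city(lat, lng).city_name for lat, lng in zip(lats, lngs)}

records = []
for i, city in enumerate(cities):
    resp = requests.get(URL, params={"q": city, "appid": API_KEY, "units": "imperial"})
    if resp.status_code != 200:
        continue  # skip cities OpenWeatherMap can't find
    print(f"Processing record {i} | {city}")
    data = resp.json()
    records.append({"City": city, "Lat": data["coord"]["lat"],
                    "Max Temp": data["main"]["temp_max"],
                    "Humidity": data["main"]["humidity"]})

df = pd.DataFrame(records)
df.to_csv("cities.csv", index=False)

# Linear regression for one relationship in one hemisphere
# (repeat for the other variables and the Southern Hemisphere).
north = df[df["Lat"] >= 0]
slope, intercept, r, p, stderr = linregress(north["Lat"], north["Max Temp"])
plt.scatter(north["Lat"], north["Max Temp"])
plt.plot(north["Lat"], slope * north["Lat"] + intercept, color="red")
plt.xlabel("Latitude")
plt.ylabel("Max Temperature (F)")
plt.title(f"Northern Hemisphere: Max Temp vs. Latitude (r = {r:.2f})")
plt.savefig("north_temp_vs_lat.png")
```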

python-docs-samples

Code samples used on cloud.google.com

License: Apache-2.0 · Stargazers: 0 · Issues: 0

pytorch-1

Tensors and Dynamic neural networks in Python with strong GPU acceleration

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 1
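
A tiny sketch of the tensor/autograd core this fork mirrors; the toy linear regression is illustrative only:

```python
import torch

# Toy linear regression with autograd; runs on GPU when one is available.
device = "cuda" if torch.cuda.is_available() else "cpu"
x = torch.randn(100, 1, device=device)
y = 3 * x + 0.5 + 0.1 * torch.randn_like(x)

w = torch.zeros(1, device=device, requires_grad=True)
b = torch.zeros(1, device=device, requires_grad=True)

for step in range(200):
    loss = ((x * w + b - y) ** 2).mean()
    loss.backward()                      # dynamic graph: gradients via autograd
    with torch.no_grad():
        w -= 0.1 * w.grad
        b -= 0.1 * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w.item(), b.item())
```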

stanford-cs-230-deep-learning

VIP cheatsheets for Stanford's CS 230 Deep Learning

License: MIT · Stargazers: 0 · Issues: 1

swift

Swift for TensorFlow Project Home Page

License: Apache-2.0 · Stargazers: 0 · Issues: 0

tensorflow-lifetime-value

Predict customer lifetime value using AutoML Tables, or ML Engine with a TensorFlow neural network and the Lifetimes Python library.

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 1
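
For the Lifetimes portion of that pipeline, the usual pattern looks roughly like this, using the library's bundled CDNOW sample data rather than anything from this repo:

```python
from lifetimes import BetaGeoFitter
from lifetimes.datasets import load_cdnow_summary

# RFM-style summary data: purchase frequency, recency, and customer age (T).
data = load_cdnow_summary(index_col=[0])

# Fit a BG/NBD model and predict purchases over the next 90 days.
bgf = BetaGeoFitter(penalizer_coef=0.0)
bgf.fit(data["frequency"], data["recency"], data["T"])
data["predicted_90d"] = bgf.conditional_expected_number_of_purchases_up_to_time(
    90, data["frequency"], data["recency"], data["T"]
)
print(data.sort_values("predicted_90d", ascending=False).head())
```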

training-data-analyst

Labs and demos for courses for GCP Training (http://cloud.google.com/training).

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 1

yelpapi

yelpapi is a pure Python implementation of the Yelp Fusion API (aka Yelp v3 API).

Language: Python · License: BSD-3-Clause · Stargazers: 0 · Issues: 1
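
Typical usage looks like the sketch below; the API key is a placeholder, and `search_query` forwards its keyword arguments to the Yelp Fusion business-search endpoint:

```python
from yelpapi import YelpAPI

# Placeholder key; create one in the Yelp Fusion developer console.
yelp = YelpAPI("YOUR_YELP_API_KEY")

response = yelp.search_query(term="coffee", location="Berkeley, CA", limit=5)
for business in response["businesses"]:
    print(business["name"], business["rating"])
```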