Two years ago I wrote a post about how to create a database of Statcast data using the
baseballr package for R. I, and others, have since made improvements to the scrape_statcast_savant function to make it easier to automate the build.
As before, the trick is to go year by year and, at most, week by week. BaseballSavant limits the size of any query to about 40,000 rows, or one week of games.
I place all my data in a PostgreSQL database, so the code below assumes you are dumping your data into a similar setup. Of course, you can use whatever database type you choose.
First, load the following packages:
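The original code block was lost from the page; a minimal sketch of the packages this workflow relies on (RPostgres is an assumption here, since any DBI-compatible backend will work):

```r
# Packages assumed by the rest of the post; install from CRAN if needed.
library(baseballr)   # scrape_statcast_savant()
library(tidyverse)   # dplyr, purrr, tibble, etc.
library(DBI)         # generic database interface
library(RPostgres)   # PostgreSQL driver (swap for your own backend)
```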
Note:
myDBconnections is a personal package that makes it simpler for me to connect to my existing databases, both local and remote
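If you don't have a helper package like that, a plain DBI connection does the same job. The database name, host, and credential variables below are placeholders, not the author's actual setup:

```r
# Hypothetical stand-in for myDBconnections: a direct DBI connection.
# Credentials are read from environment variables rather than hard-coded.
statcast_db <- DBI::dbConnect(
  RPostgres::Postgres(),
  dbname   = "statcast",            # placeholder database name
  host     = "localhost",
  port     = 5432,
  user     = Sys.getenv("PG_USER"),
  password = Sys.getenv("PG_PASS")
)
```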
Second, we load some helper functions. The first is the main function for creating the week breaks and dates for scraping game data:
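The helper itself did not survive extraction; what follows is my reconstruction of its shape based on the step-by-step description below. The function and object names (annual_statcast_query, date_grid, safe_savant) are my own labels, and the player_type argument value is an assumption:

```r
# Sketch of the scraping helper: build week-long date windows for a season,
# wrap the scraper in purrr::safely(), and loop over the weeks.
annual_statcast_query <- function(season) {

  # Week start dates from the beginning of March through the end of November
  starts <- seq.Date(as.Date(paste0(season, "-03-01")),
                     as.Date(paste0(season, "-11-30")), by = "week")

  # Grid of weeks: each end date is simply six days after the start date
  date_grid <- tibble::tibble(start_date = starts,
                              end_date   = starts + 6)

  # 'Safe' version of the scraper so one failed week doesn't stop the loop
  safe_savant <- purrr::safely(baseballr::scrape_statcast_savant)

  payload <- purrr::map2(date_grid$start_date, date_grid$end_date,
                         function(s, e) {
    message("Scraping week of ", s, "...")
    safe_savant(start_date = s, end_date = e, player_type = "pitcher")
  })

  # Keep successful results only, drop empty data frames, then bind
  results <- purrr::map(payload, "result")
  results <- purrr::keep(results, ~ !is.null(.x) && nrow(.x) > 0)
  dplyr::bind_rows(results)
}
```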
Let’s step through this. The first action takes the season of interest and creates weeks of dates starting in March and running through the end of November. This means you will pick up some Spring Training games and all Postseason games. Next, it creates a grid of the weeks with start and end dates, the end date simply being six days after the start date. Then we need to create a ‘safe’ version of the
scrape_statcast_savant function so that if a week doesn’t process we can capture the error without stopping the entire loop.
The big action comes with the
map function. Here, we loop over each row of the date_grid, using each row's dates as the start and end dates. For each row, the function prints a message letting you know which week is being acquired. After the function runs, it collects each week into a dataframe within a larger list by isolating all result objects (as opposed to errors) and then eliminating any result that contains an empty dataframe. This makes binding less problematic.
I have an additional function that I run over each season’s worth of data to add variables and ensure that all columns are consistent in class for appending to the database.
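That function was also lost; the author's exact variable list is unknown, so the sketch below only illustrates the shape of such a cleanup step. The column names inside it are illustrative guesses at fields that commonly need coercion:

```r
# Illustrative cleanup step: coerce key columns to stable classes so every
# season appends cleanly to the same table. Column choices are examples only.
format_append_statcast <- function(df) {
  df |>
    dplyr::mutate(
      game_date = as.Date(game_date),
      # Statcast sometimes returns numeric fields as character; force them back
      dplyr::across(dplyr::any_of(c("release_speed", "release_spin_rate",
                                    "launch_speed", "launch_angle")),
                    as.numeric)
    )
}
```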
Finally, this function will automate uploading to your database:
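A sketch of that upload helper, matching the description that follows. The table name and connection object are assumptions:

```r
# Delete any existing rows for the same game_year, then append the new data.
# This keeps the table free of duplicates when a season is re-scraped.
upload_statcast <- function(df, con, table = "statcast") {
  yr <- unique(df$game_year)
  if (DBI::dbExistsTable(con, table)) {
    DBI::dbExecute(con, paste0("DELETE FROM ", table,
                               " WHERE game_year = ", yr))
  }
  DBI::dbWriteTable(con, table, df, append = TRUE)
}
```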
This function establishes a connection to your database, removes any existing data with the same
game_year as your fresh upload, then appends the new data to the table. I do this to ensure no duplicates and a clean data set, as BaseballSavant will oftentimes update data from previous seasons.
Now that we have our functions, we are ready to roll.
If you don’t have an existing database set up, I typically run the first year alone and then use the map function to handle the rest:
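Using the reconstructed helpers from above, the first-season run might look like this (2008 as the starting year follows from the remaining years being 2009 through 2019):

```r
# Scrape, format, and upload the first season on its own
payload_2008 <- annual_statcast_query(2008)
df_2008      <- format_append_statcast(payload_2008)
upload_statcast(df_2008, statcast_db)
```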
We can check to make sure the database exists and houses the data:
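A quick sanity check, assuming the table name used above:

```r
# Row counts per season confirm the upload landed
DBI::dbGetQuery(statcast_db,
  "SELECT game_year, count(*) AS n FROM statcast GROUP BY game_year")
```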
Next, we can map over the remaining years, 2009 through 2019, using the following code:
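A sketch of that loop, with the progress messages and the sleep described below:

```r
# Loop the remaining seasons; pause five minutes between each to be polite
# to BaseballSavant's servers.
purrr::walk(2009:2019, function(season) {
  message("Starting season ", season, "...")
  payload <- annual_statcast_query(season)
  df      <- format_append_statcast(payload)
  upload_statcast(df, statcast_db)
  message("Finished season ", season, "; sleeping 5 minutes.")
  Sys.sleep(300)
})
```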
You can see I included some additional messages to keep you sane during the process, as well as five minutes of sleep between each season.
The entire process can take anywhere between 70 and 120 minutes.
When you are done, your data should look something like this:
I also highly recommend indexing the database where possible to make your queries run faster. Here are the standard ones I create whenever the database gets updated:
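The author's exact index list is unknown; the columns below are illustrative choices for fields that Statcast queries commonly filter on:

```r
# Illustrative indexes on commonly-queried columns; adjust to your own queries
indexes <- c(
  "CREATE INDEX IF NOT EXISTS statcast_game_date_idx ON statcast (game_date)",
  "CREATE INDEX IF NOT EXISTS statcast_batter_idx    ON statcast (batter)",
  "CREATE INDEX IF NOT EXISTS statcast_pitcher_idx   ON statcast (pitcher)"
)
purrr::walk(indexes, ~ DBI::dbExecute(statcast_db, .x))
```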
Hopefully this helps and if you have any questions, feel free to reach out.