In the privacy panel of your account settings, you can turn off sharing your Facebook data with Spotify and opt out of personalized advertisements. You can also request a download of your streaming history from the last 90 days. It took about a week for Spotify to collect the data and notify me in an e-mail. The data you receive is a couple of JSON files.
I have only used the StreamingHistory.json file. It contains the date and time each song was played and the length you listened to it in milliseconds. I filtered out the songs that played for less than 2 minutes to keep only the songs I liked. I also created a calculated column that takes you to the given song’s Spotify page, where you can play it.
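The filtering itself was done in Power BI; here is a minimal sketch of the same idea in Python. The field names (`endTime`, `artistName`, `trackName`, `msPlayed`) are the ones Spotify’s export uses, and the search-based Spotify URL is a hypothetical stand-in for the calculated column described above:

```python
from urllib.parse import quote

# Sample rows shaped like Spotify's StreamingHistory.json export
# (assumed field names: endTime, artistName, trackName, msPlayed).
history = [
    {"endTime": "2018-11-01 13:05", "artistName": "The Marvelettes",
     "trackName": "Please Mr. Postman", "msPlayed": 151_000},
    {"endTime": "2018-11-01 13:07", "artistName": "Skipped Artist",
     "trackName": "Skipped Song", "msPlayed": 45_000},
]

TWO_MINUTES_MS = 2 * 60 * 1000

# Keep only the songs that played for at least two minutes.
liked = [row for row in history if row["msPlayed"] >= TWO_MINUTES_MS]

# Hypothetical "calculated column": a Spotify search URL for each song.
for row in liked:
    query = f'{row["artistName"]} {row["trackName"]}'
    row["spotifyUrl"] = "https://open.spotify.com/search/" + quote(query)

print(liked)
```

In the real report the same two steps are a filter on the milliseconds column and a custom column in Power Query.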
My list contains the Marvelettes song that Fury recommended too!
A couple of months ago I was reading Twitter while eating lunch at my workplace in Budapest when I saw a tweet announcing that Alberto Ferrari was going to Prague to hold a training day about data modeling in Power BI, SSAS Tabular and Excel.
I want to thank the organizers of the SQLSaturday Prague event held on 7–8 December 2018 (pre-cons included); they were friendly and attentive, and the event was thoroughly organized and managed. The sponsors and the fellow attendees also deserve a big thank you (Děkuji vám!) for the event’s friendly, fun atmosphere.
I was a bit nervous before the pre-con. I was afraid of being late, so I visited the venue the evening before the event and set up three different alarms to wake me up in time. The event ran from about 9 AM to about 5 PM in the beautiful building of the Czech savings bank.
It was great to meet Alberto Ferrari in person after reading the books and blogs authored by him and Marco Russo. As Mr. Ferrari said, building a good data model comes first, because when you have a well-designed model, you don’t have to write such difficult DAX measures. We talked about different data modeling scenarios and the dangers of using the bi-directional relationship feature of Power BI. I really recommend attending a course taught by Mr. Ferrari, even if you have to travel a bit to get to the venue. Try to finish reading The Definitive Guide to DAX by February 2019, because that’s when the 2nd edition is going to be published!
Three presentations ran in parallel, so everyone had to choose 7 of the available 21.
Administering SSRS/PBIRS was a boring task! Sorry… did you say WAS?
The main message was that implementing even a single change is quite slow using the UI of SSRS/PBIRS. Imagine how much time you can save by doing it with PowerShell!
Inside the VertiPaq engine
I thought I knew this topic very well because of my relevant MCP exams (70-768) but there were some new elements such as hash encoding and materialization. I am going to research these. If you have resources about a deep dive into the VertiPaq engine, please let me know on LinkedIn.
This presentation got strong applause from the audience because the solution is easy to implement and very useful. You can use dynamic management views and system views to analyze the logs and check the usage and performance of your reports.
Automating the testing of your BI solutions with NBi. Everyone knows how important testing is. With the open-source NBi framework, you can create and store tests for different tools such as SSIS and SSAS.
Dbatools – The Swiss army knife for the DBA. Marcos also got great applause because his presentation was very useful for anyone managing databases. PowerShell is awesome, and dbatools is a PowerShell module, if I am not mistaken.
Well, modifying a data model already implemented in a database or DWH is something I have experience in, but I never suspected that data modeling is this complicated. Life is complicated, the business processes of banks and other large companies are particularly complicated, and databases try to store events in a way that is easy to query, so it is no surprise that a lot of data models are more complicated than a star schema.
T-SQL is the first programming language you should master if you want to be a data scientist or a Power BI ninja – whichever is cooler. There is an ocean of beginner-level study sources; try to avoid the worthless kind. Try Paul Scotchford’s classes on Udemy first.
I prepared for the exam by reading the Exam Ref book and practicing with the practice test from the Microsoft Press Store.
The book is great but also very dry, so I recommend reading it multiple times. Read it quickly the first time, without getting bogged down in the very complicated topics. I soon realized that I learn much better by watching videos, so I enrolled in one of the edX courses about SQL. After enrolling, I immediately downloaded all the videos and subtitles from the site for offline viewing, and I found good explanation videos on other platforms such as YouTube. After watching videos about every topic of the exam, reading and understanding the Exam Ref book a second time was much easier.
The exam was very difficult. Some questions were tricky or just not clear to me. For some questions you are required to type in the code, which is a lot harder than choosing the right answer from multiple options. I used all 120 minutes to answer the questions. I recommend practicing test taking, either with practice tests or by learning languages with the Duolingo app: when you have 8 possible answers, you have to scan them fast for obvious and not-so-obvious errors. You are given a laminated pad that you can use to take notes; you can write down the letters of the possible answers and cross them out one by one.
The book is not for beginners but is otherwise very good; I plan to reread it once every three months because it helps you use T-SQL more effectively.
I have a Power BI report that I want to share with you.
The idea came after I watched movies such as Lady Bird and The Lobster that I did not completely understand. Usually I just search for other people’s interpretations of the movie on Reddit, Twitter and Rotten Tomatoes.
My report takes a parameter value as the value of the Title column, and after a refresh you can look up that text on multiple websites. You just click on the URL icon under the column headers and it takes you to that website’s search results.
There is a good chance that I am not the first developer to come up with this simple idea, but I have never seen any blog posts using parameters this way.
How to do it?
The key idea is to replace the original text in the Title column with a parameter’s value. First, create a parameter named Title. Then create a custom column that replaces the search keyword with the Title parameter’s value. This way you can create new calculated columns for yourself – try adding an Amazon.com column!
In the Edit Queries menu, click Edit Parameters to change the parameter value from the one I used in the report. Click Refresh to create the new URL addresses!
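The report does this substitution in Power Query; the same idea can be sketched in Python. The search-URL templates below are my own assumptions for illustration, not the exact ones used in the report:

```python
from urllib.parse import quote_plus

# Hypothetical search-URL templates, one per site; {q} is the placeholder
# that the Title parameter's value gets substituted into.
SEARCH_TEMPLATES = {
    "Reddit": "https://www.reddit.com/search/?q={q}",
    "Twitter": "https://twitter.com/search?q={q}",
    "Rotten Tomatoes": "https://www.rottentomatoes.com/search?search={q}",
}

def build_search_urls(title: str) -> dict:
    """Replace the {q} placeholder with the URL-encoded title."""
    q = quote_plus(title)
    return {site: tpl.format(q=q) for site, tpl in SEARCH_TEMPLATES.items()}

urls = build_search_urls("The Lobster")
for site, url in urls.items():
    print(site, url)
```

Changing the parameter and refreshing the report is the Power BI equivalent of calling the function with a new title.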
You can download the PBIX file and try it yourself!
I have created a data model and a report in Power BI connecting to the API. Unfortunately, it looks like an abandoned API, because the last film its data describes is The Force Awakens. Still, creating a report connecting to the API is a good way to practice your data modeling and data wrangling skills.
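The post does not name the API, but the Force Awakens detail suggests the Star Wars API (SWAPI), whose /films records link characters as URLs. The modeling work was done in Power BI; as a sketch of the wrangling involved, here is one film record (sample data shaped like the API’s output, not a live response) flattened into a fact row and a bridge table in Python:

```python
# Sample record shaped like a SWAPI /films entry; the character list
# contains resource URLs rather than IDs, which is what makes the
# flattening step necessary for a relational data model.
film = {
    "title": "The Force Awakens",
    "episode_id": 7,
    "release_date": "2015-12-11",
    "characters": [
        "https://swapi.dev/api/people/1/",
        "https://swapi.dev/api/people/5/",
    ],
}

# One row per film for the films table...
film_row = {k: film[k] for k in ("episode_id", "title", "release_date")}

# ...and a bridge table linking the film to each character ID,
# extracted from the trailing segment of the character URL.
bridge = [
    {"episode_id": film["episode_id"],
     "character_id": int(url.rstrip("/").rsplit("/", 1)[-1])}
    for url in film["characters"]
]

print(film_row)
print(bridge)
```

In Power BI the same split happens by expanding the characters list into rows and relating the two resulting tables on the film key.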