Spotify Doesn't Care

In case the professional artwork above isn't clear, I turned the Spotify logo into a sad face with three frowns.  I'm so original.


I'm not usually one to whine and bitch publicly, but a recent customer service process I had to deal with was particularly horrendous, so I think it deserves to be known.  Ideally, someone at Spotify can review the case as a whole, rather than separate support staff members viewing it individually.  Hopefully it's taken as constructive criticism and they improve their current processes toward providing a much better customer service experience.  Spotify likely concentrates on supporting current customers, but helping people who aren't active customers, and who could even be prospective customers, should also be high on their priority list.


About a month ago I noticed that someone had used my credit card number to open two Spotify accounts.  After disputing the charges with the bank (which was extremely easy to do by comparison), I went to contact Spotify to let them know about the fraudulent charges so someone wasn't continuing to get free music and bandwidth that could go to a rightful paying customer.  You'd think this would be something they would care about and want to fix immediately.


So I did what any person in the 21st century would do: I went to their website to find a customer service phone number or email address so I could speak with someone and get this handled in 10-15 minutes.



The only problem is there isn't a number or email address clearly listed.  Keep in mind that Spotify wants to be a world-class corporation; some recent numbers I've heard are that they're currently valued around $5 billion and have 40 million customers, 10 million of whom are paying customers.  Pretty impressive growth for a company that fills a role once dominated by free services (replacing empty silence with music, as the radio did) and that now has many competitors offering similar services for similar prices.  You'd think one way they would want to stand out from the other options would be excellent customer service.



Once again, no clear answer as to how I'm supposed to talk to someone about this.  There is a reference to their customer support Twitter handle though, @SpotifyCares, hence the title of this post.  It just sucks that instead of having a private conversation with somebody about my problem, I have to broadcast it across the entire internet (since the entire internet is my 56 followers).  What I don't get is that you'd think a company would want to keep this private as well, but once again, you'd be wrong.



Don't be fooled by the highlighted link for a contact form; you have to be an existing Spotify customer or sign up for an account to use it.  I guess this is going public and that's how we're forced to handle things.  My apologies to the 56 people who quickly scrolled past my tweets during this whole ordeal.



Here we learn that there is indeed no phone number to call to talk to a human being.  Think about that for a second: a company supposedly worth $5 billion, with 10 million people paying them every month, does not currently offer telephone support.  I get that running lean and optimizing your resources' time is important, but sometimes the most efficient way is talking.  For both parties involved, the customer and the provider.


Even worse, the link they sent me to use instead of a phone call was the same one that REQUIRES you to have a Spotify account.  And if you're not a current customer, well damn, you're shit out of luck, you have to sign up.  I think it was at this point in the "What the fuck?!?" stage of the whole process that I reached peak frustration.



On principle, and because it's the dumbest thing I've ever heard of, signing up just to hopefully solve this problem quickly was out of the question for me.  They have my email; they can contact me to fix this.  I think I still had a sliver of hope at this point that they would do the right thing and make this as easy as possible for the customer.


Nope.



It was at this point that a friend chimed in, equally confused and surprised that a company that wants to portray itself as a world-class organization is failing at the basics of customer support.  See, one other person knows about this shitty scenario, this is going viral! [Insert internet sarcasm emoticon] Only after tagging their Twitter handle again and making up the completely unoriginal hashtag #spotifydoesntcare did they reach out again.  Unfortunately, it was via Twitter again: they followed me and wanted me to send them a DM with my email address so someone could contact me.  Meaning they didn't even read my earlier tweet giving my email address.


A reminder at this point in the bitching: this could've been solved in 5-10 minutes of talking.  I just like pointing that out.



Following this is another Twitter exchange in public view instead of privately.



Finally, a fucking link to a contact form! Maybe that was their plan all along: make a prospective customer so frustrated that when, 3 days after initiating communication, you're finally provided with a way to accomplish a really simple goal, you're overjoyed.  They make you hate them before answering your first question.



I don't know if "V" and "Z" ever talk to each other, but I'm going to do all of their jobs for them by posting the link to this hidden contact form that for some unknown reason wasn't clearly marked on their website.


https://www.spotify.com/us/about-us/contact/contact-spotify-account/


So they email me back, meaning finally I can have a "private" (Hi CIA!) conversation.  Ahhhhh, only thing is the email just says that they received my contact form.  Well hopefully they get their shit in order and figure this all out.  I mean, there's no way that this will continue to drag on, right? 


Yep.



Two more days go by with no answer, so my politeness is wearing thin but I reach out via Twitter again to get their attention.  "T" tells me she just assigned my case to someone, something "V" or "Z" should've done 5 days ago.  What the "F"!



Finally it looks like someone is giving me actionable items to get the ball rolling.  "J" is by far my favorite customer support rep thus far in the process.  Another problem arises as getting this BAC number from the bank isn't very easy either, but is handled in 15-20 minutes.



I guess "C" isn't too bad of a support rep either, he probably would be number 1 in my book but it took him 6 days to get back to me. So I'll put him in a tie for first.  Again I want to point out that this could've been solved in 5-10 minutes of talking.  At this point in the story we're 12 days into the process.



What the fuck "C" I thought we were boys??? My name is in every email, just look at it when you're referencing a prospective customer.  Fine, you're definitely not tied for first anymore in the "Spotify Customer Support Staff" rankings.  But finally this horrible, horrible customer service experience is resolved.  


I don't think I need to remind anyone of how long it should've taken to fix this problem in my eyes, but in the end it took over 14 days.  I hope someone at Spotify reads this and sees how bad they look and fixes their support system.  I mean I was never really considering signing up for a Spotify account but I for sure will never consider it in the future.  And maybe the 3-4 people that read this post will second guess it themselves.

Cover Me Gently

One other thing I'll post a bunch of, besides football and solar talk, is music.  Particularly electronic music.  I used to post over at loudmusicallday.com with some friends, but that kind of died off and I discover too much good music not to share.


Below are some good covers I've found recently:


Thief - Cry Me a River


Cassie Steele - Sex and Candy


Animal-Music - Mr. Brightside

Reviewing the Top 5 Mocked Picks

Digging a little deeper into aggregated mock drafts, I turned to the great resource Walter Football, which has a very comprehensive mock draft database.  They have gone through the trouble of tabulating the top 5 picks of each mock in one big table.  That table looked something like this:



So I tabulated the top 5 results from the mock draft database, looked only at mocks published in 2014, and graphed their evolution over the last 4 months of the 2014 NFL Draft process.  That ended up being 428 mocks for each pick in the top 5, from what is likely the largest mock draft database on the web.  The graph has "Month" on the X axis, January through May, and "Average pick" on the Y axis for some of the top prospects.  So Jadeveon Clowney having an average close to 1 in May means, wait for it, that the consensus was he would be selected #1.
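
For anyone curious how that tabulation works mechanically, here's a minimal sketch using pandas.  The file name and column names (`month`, `player`, `pick`) are placeholders for however the Walter Football data gets scraped; it just shows the group-and-average step behind the graph.

```python
import pandas as pd

# Hypothetical input: one row per (mock draft, pick slot), with the month it was published.
# Columns assumed: month (1-5), player, pick (1-5).
mocks = pd.read_csv("top5_mocks_2014.csv")

# Average pick slot for each prospect in each month -- the values plotted on the Y axis.
avg_pick = (mocks
            .groupby(["month", "player"])["pick"]
            .mean()
            .unstack("player"))   # months as rows, prospects as columns

print(avg_pick.round(2))
# avg_pick.plot() would reproduce the month-vs-average-pick graph.
```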



With this graph it is easy to visualize the consensus order of each prospect over time.  Some interesting insights I'd love to know more about:

  • Did mock drafters become more comfortable with Clowney to the Texans over time or is that a reflection of just what they were hearing?
  • What is responsible for the separation between Blake Bortles and Clowney starting in March?  They're pretty close and then Clowney clearly separates himself from Bortles and the pack to be the clear #1 pick.  Combine or pro day?
  • Did Jake Matthews get worse in March or did people only realize then that Greg Robinson was a better OT?
  • Who was ultimately responsible for leading the Johnny Manziel hype train? The media, the fans or was it all smokescreens from teams?


Thoughts on the Solar Ambassador program by SolarCity

"Luck is what happens when preparation meets opportunity" - Seneca



Cliche, I know.  But even behind overused, corny phrases lies some truth.  What might look like luck after the fact was likely only made possible because of many hours spent trying to better yourself in the past.  It's good to keep that in perspective because only by taking advantage of those opportunities can you create your own luck.

Full disclosure: I've worked for a solar energy provider called SolarCity for the past 4 years.  I've seen it grow from a small Silicon Valley startup to what is now a publicly traded behemoth.  I love the company and it has been a great ride so far.  I recognize this opportunity to be a part of a potentially special company from its roots is a once in a lifetime event and I don't want to take it for granted.  It has been great for me professionally too as I've had the chance to work with a lot of truly great people and feel like I have more impact than I would at a different company.

Anyways, I have spent a lot of time at work and outside of it thinking about ways to help spread solar adoption, for SolarCity's benefit as well as the environment's.  One of the themes that consistently stuck out to me was enabling solar adoption to spread virally by giving more power to the public in the process.  Our old referral process involved a referrer giving someone's name and contact information for one of our salespeople to follow up with.  This was beneficial because it allowed just about anyone to get paid for recommending solar energy systems to others, but I felt it didn't go far enough to really become a monster and spread by itself without additional SolarCity resources spent on it.

About a year ago I presented an idea to a lunchtime gathering of coworkers that was intended to be a forum to share thoughts.  The idea was to make the referral process itself more viral by making the act of sharing quicker: cut the salesperson out of the process and let anybody create very basic sales proposals for anyone else themselves.  I believed this would encourage sharing because the referrer would assume personal responsibility for spreading the message, whereas our current process was too passive.  My thought for the medium to spread it was a social media or mobile game that would simplify sharing by making the referral process more fun.  People could set up networks of additional referrers and be rewarded when they referred others.  Who wouldn't want to kill a couple minutes in line or waiting on something, and potentially make a couple hundred bucks, by promoting the spread of clean energy?

Slide that explains why a distributed network of referrers is better

Slide that explains why decreasing the amount of time through the viral loop (time to share a proposal) is important

After vetting the idea with Operations leaders, I met with the heads of our Marketing and Sales departments and presented it to them.  By this time I had done some more prep and had found viral coefficient metrics for our referral program that previously weren't known.  Basically, we would improve the virality of the program by encouraging people not only to refer others to get a solar energy system installed, but also to get the people they refer to refer more people in turn.  We would ride the inherent viral aspects of multi-level marketing to help grow the program.  They were initially intrigued, but my communication with them fell off when I moved away from our HQ to work regionally.
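
For context on what "viral coefficient" means here, this is a rough sketch of the standard textbook calculation, with made-up numbers rather than any real program data: the number of referrals each person sends times the rate at which those referrals become referrers themselves.

```python
# Rough sketch of the standard viral coefficient math (illustrative numbers only).
invites_per_referrer = 4      # how many proposals/referrals the average person shares
conversion_rate = 0.20        # fraction of those who become referrers themselves

k = invites_per_referrer * conversion_rate   # viral coefficient

# With k > 1 each "generation" of referrers is bigger than the last; with k < 1 growth
# stalls without extra marketing spend, which was the worry with the old referral process.
referrers = 100
for generation in range(1, 4):
    referrers *= k
    print(f"generation {generation}: ~{referrers:.0f} new referrers")
```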

Slide that explains why the referral process in the early market of solar adoption doesn't work in the mainstream market of adoption

Slide that explains how we could "cross the chasm" to become an adopted technology

Recently though we rolled out a program called "Solar Ambassadors" that will allow anyone to build a referral network of up to three levels (someone you tell, someone that person tells, and then someone else that THAT person tells) and be compensated if any of those referrals lead to solar energy customers.  I'm happy to see that the principles of the idea are still alive and well and that it will be given a chance to prove its worth.  It looks like some very smart people have been working on this for a long time, and I think it has the makings of a successful way to grow and acquire customers.  Even if it's not exactly the way I would've done it, the basic tenets of improving the viral coefficient of the referral program are still there.  This program lets you help promote the adoption of a technology that will change the world for the better while saving people money immediately.
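
To make the "up to three levels" idea concrete, here's a toy sketch of how credit could flow up a referral chain when a sale closes.  The names, payout amounts and the `referred_by` structure are made up for illustration; the actual Ambassador terms are whatever SolarCity publishes.

```python
# Toy model: who referred whom (child -> parent). Names and payouts are made up.
referred_by = {"dana": "carol", "carol": "bob", "bob": "alice"}
payout_by_level = [100, 50, 25]   # hypothetical $ for levels 1, 2, 3 above the new customer

def credit_chain(new_customer):
    """Walk up to three levels of referrers and return who gets paid what."""
    payouts = {}
    person = new_customer
    for amount in payout_by_level:
        person = referred_by.get(person)
        if person is None:
            break
        payouts[person] = amount
    return payouts

print(credit_chain("dana"))   # {'carol': 100, 'bob': 50, 'alice': 25}
```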

I'm sharing this story because I am proud to see something I worked hard on actually fleshed out and live.  If it does turn out to be a big success, I'll be able to take some small bit of credit for doing my part in the beginning, whether directly or indirectly.

: )

//Edit, I guess employees aren't eligible for the Ambassadors program, but I encourage everyone else to join

Reviewing the Draft Order Prediction

So I quickly compared how my simple prediction fared against the 2014 mocks I collected, and I will add the comparison numbers and rankings to this post later today.  But I think it turned out pretty well: comparing the predicted and mocked picks to the actual selection order, I believe I had an R-square value of 0.57, while the highest mock I looked at was Mayock's with an R-square of 0.55.

What I find particularly impressive about this prediction is how simple it was -- all it looked at was a Top 100 composite ranking, height, weight, arm length and average mock draft position.  It outperformed all of the experts' mock accuracy, and I didn't spend months working on it and had no inside knowledge; I just aggregated what others were hearing and thinking.


To review, my selection prediction for the 2013 Draft had an R-square value of 0.61 and the next highest was Todd McShay at 0.5.  My selection prediction for the 2014 Draft had an R-square value of 0.57 and the next highest of the ones I looked at was Mike Mayock at 0.55.  I could have looked at more mocks to be more comprehensive, and some might have performed better than the ones I included, but if they had been added to the dataset, the prediction would have improved as well.
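
For anyone wanting to reproduce the comparison, the grading is just an R-square between two lists of pick numbers.  A minimal sketch (the numbers here are placeholders, not the 2014 results):

```python
import numpy as np

# Placeholder arrays: predicted selection slot vs. the slot the prospect actually went at.
predicted = np.array([1, 2, 3, 4, 5, 6, 7, 8])
actual    = np.array([1, 2, 5, 4, 11, 7, 6, 22])

# R-square of a simple linear fit between the two orderings,
# the same metric used to grade the pundits' mocks.
r = np.corrcoef(predicted, actual)[0, 1]
print(f"R-square: {r**2:.2f}")
```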

Predicted 2014 Draft selection order

Below is the predicted order based on the following variables:


Top 100 consensus ranking
Height
Weight
Arm Length
Avg Mock Draft Points


Order | Player | Pos | Top100 | Height (in) | Weight (lb) | Arms (in) | Avg Mock Draft Points | Pred Mock Draft Points
1 Jadeveon Clowney DE 1 77 266 35 2911.111111 2656.180654
2 Greg Robinson OT 3 77 332 35 2600 2414.823449
3 Khalil Mack OLB 2 75 251 33 2188.888889 1990.70493
4 Sammy Watkins WR 4 73 211 32 1833.333333 1646.896388
5 Taylor Lewan OT 10 79 309 34 1494.444444 1581.072755
6 Mike Evans WR 6 77 231 35 1450 1566.586301
7 Jake Matthews OT 5 77 308 33 1572.222222 1558.98738
8 Johnny Manziel QB 11 72 207 31 1572.222222 1386.016481
9 Anthony Barr OLB 13 77 255 34 1094.444444 1250.329576
10 Eric Ebron TE 12 76 250 33 1175 1247.941459
11 Aaron Donald DT 8 73 285 33 1215.555556 1219.553732
12 Zack Martin OT 15 76 308 33 1144.444444 1216.86755
13 Justin Gilbert CB 16 72 202 33 1188.888889 1188.137736
14 Blake Bortles QB 14 77 232 33 1044.444444 1170.458139
15 Cyrus Kouandjio OT 34 79 322 36 797 1139.066329
16 C.J. Mosley ILB 9 74 234 33 1058.333333 1126.560729
17 Odell Beckham Jr. WR 18 71 198 33 1022.222222 1044.672079
18 Ha Ha Clinton-Dix FS 18 73 208 32 990 1011.699527
19 Kyle Fuller CB 27 72 190 33 950 1006.921183
20 Morgan Moses OT 36 78 314 35 695.8333333 1000.100097
21 Timmy Jernigan DT 30 74 299 32 925 966.0854087
22 Ja'Wuan James OT 57 78 311 35 660 965.7691372
23 Ra'Shede Hageman DT 24 78 310 34 640 918.4423979
24 Jimmie Ward SS 35 71 193 31 950 895.0107152
25 Calvin Pryor FS 22 71 207 31 927.7777778 881.5884295
26 Jace Amaro TE 39 77 265 34 606.6666667 874.7488037
27 Darqueze Dennard CB 19 71 199 30 975.5555556 874.4638941
28 Kony Ealy DE 26 76 273 34 620 869.9957469
29 Joel Bitonio OT 46 76 302 34 597.5 842.0860553
30 Ryan Shazier OLB 21 73 237 32 766.1111111 839.6863546
31 Derek Carr QB 33 74 214 32 743.3333333 839.5957556
32 Dee Ford DE 31 74 252 33 680 833.3038955

This was published at 1:30 am MST on 5/8/14


Thoughts on trying to predict the NFL Draft

Although analyzing the relationships between variables of NFL draft prospects was actually very interesting and I could've continued digging deeper forever, ultimately I needed to move on and try to create a prediction formula for the draft selections.

My first attempts were to predict the aforementioned draft value of each prospect given their physical measurements, top 100 ranking composite score, aggregated mock draft scores and how many visits they made.  It actually did pretty well in terms of the accuracy measure I mentioned in my earlier post comparing the NFL pundit predictions.  I do want to reiterate that I graded on pick selection accuracy, not team-based accuracy, so it's more a reflection of what pick number the prospect should be selected at than which team they should go to.  It's just a quick way to make a general suggestion of what should happen.
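
For anyone curious what an attempt like that looks like mechanically, here's a minimal sketch with scikit-learn.  The file name and column names are placeholders, and plain least squares stands in for whatever the actual formula was.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical training table: one row per 2013 first-round prospect.
df = pd.read_csv("prospects_2013.csv")
features = ["top100_rank", "height", "weight", "arm_length", "avg_mock_points", "visits"]

X = df[features]
y = df["jj_draft_value"]   # draft value of the pick the prospect actually went at

model = LinearRegression().fit(X, y)
print("R-square on training data:", round(model.score(X, y), 2))

# Predicted draft value for each prospect, sortable into a projected selection order.
df["pred_value"] = model.predict(X)
```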

Earlier, I looked at pundits' predicted vs. actual JJ Draft Value plots and ranked them by how much of the actual draft value an algorithm could predict given their previous mock draft results.  Here is the prediction formula's plot; it performed well, with an R-square value of XXX.  That would place it above the highest-ranking expert I looked at, Todd McShay, who had an R-square value of 0.76.  Not bad for someone who has no experience scouting and no inside NFL knowledge.


Anyways, I graded the prediction formula against the pundits a little differently as well.  This time, instead of using JJ Draft Value as the metric, I'll use pick selection for the 2013 Draft.  So I compared the prediction formula's projected picks to the experts' picks; here are the experts' picks:


You can see that Mel Kiper did much better in 2013 than he has done in his career, but Todd McShay is still doing better.  By the way I don't mean to pick on Mel or anything, he's just the most recognizable NFL Draft expert.  Anyways, below are my picks versus the actual selections in the 2013 Draft:


What I think this shows is that if you use even a very basic prediction algorithm of what pick a prospect is going to be selected at, rather than trying to predict what team will select them, you can get a better sense of a prospect's draft value than even some of the experts in the field.  

I would love to see people take ideas from this project and expand on the research; I'm sure it would be valuable to any NFL front office.  I will post my data after the project is complete.  In the next post, I will attempt to predict the 2014 draft order.



Exploratory Draft Data: Top 100 rankings

I deviated from the last topic I said I was going to dive into at the outset of the project.  Instead of looking at the college statistics of each individual, I thought it would be more prudent to study the overall rankings of each player.  College stats wouldn't be the best predictors because they're heavily dependent on the scheme and the caliber of teammates each prospect had, whereas overall rankings compare prospects directly against their peers at their respective positions.  I think there will be some crossover effects with the mock draft rankings, but mock drafts project fit with teams and aren't true ordered rankings.


To create a composite-style ranking similar to this one, I collected the top 100 rankings of 10 different draft pundits, averaged their rankings to get a consensus, and then re-ranked them.  Below are the top 50 results from 2013, since my test draft prediction will be run on that draft.


Pundits used:


DraftTek
Mike Mayock
NE Draft
Walter Football
Blogging the Boys
Gil Brandt
New Era Scouting
NFL Draft Scout
Matt Miller
Daniel Jeremiah



Player | College | Rank | Avg Rank
Luke Joeckel Texas A&M 1 1.8
Eric Fisher Central Michigan 2 3.1
Dion Jordan Oregon 3 4.6
Sharrif Floyd Florida 4 6.9
Chance Warmack Alabama 5 7.2
Star Lotulelei Utah 6 8.1
Lane Johnson Oklahoma 7 8.8
Dee Milliner Alabama 8 9.2
Ezekiel Ansah Brigham Young 9 10.6
Jonathan Cooper North Carolina 10 11.1
Sheldon Richardson Missouri 11 11.2
Tavon Austin West Virginia 12 13.22
Barkevious Mingo LSU 13 13.8
Kenny Vaccaro Texas 14 15
Bjoern Werner Florida State 15 17.9
Geno Smith West Virginia 16 19.4
Jarvis Jones Georgia 17 20.8
Tyler Eifert Notre Dame 18 21.2
Xavier Rhodes Florida State 19 21.4
Cordarrelle Patterson Tennessee 20 21.4
Sylvester Williams North Carolina 21 26.7
Alec Ogletree Georgia 22 27.4
D.J. Fluker Alabama 23 27.5
Cornellius Carradine Florida State 24 28.7
Desmond Trufant Washington 25 29.2
Jonathan Cyprien Florida Int'l 26 30.9
Keenan Allen California 27 31.7
Datone Jones UCLA 28 32.2
Manti Te'o Notre Dame 29 33.3
DeAndre Hopkins Clemson 30 33.5
Arthur Brown Kansas State 31 34.1
Damontre Moore Texas A&M 32 36.9
Eddie Lacy Alabama 33 37.11
Matt Elam Florida 34 38.11
Johnthan Banks Mississippi State 35 38.25
Kevin Minter LSU 36 39.3
Kawann Short Purdue 37 40.4
D.J. Hayden Houston 38 40.9
Jamar Taylor Boise State 39 42.1
Robert Woods USC 40 42.3
Matt Barkley USC 41 42.88
Jesse Williams Alabama 42 43.2
Eric Reid LSU 43 45.1
Zach Ertz Stanford 44 45.5
Menelik Watson Florida State 45 45.71
Justin Hunter Tennessee 46 45.8
Alex Okafor Texas 47 46.71
Johnathan Hankins Ohio State 48 49.33
Larry Warford Kentucky 49 51.22
Kyle Long Oregon 50 51.44
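
A minimal sketch of how a consensus table like the one above can be built, assuming each pundit's top 100 is a simple player-to-rank file (the file names and column names are my own placeholders, not the actual sources):

```python
import pandas as pd

# Hypothetical inputs: one CSV per pundit with columns player, rank (1-100).
pundit_files = ["drafttek.csv", "mayock.csv", "walterfootball.csv"]  # ...and so on for all 10

ranks = pd.concat(
    [pd.read_csv(f).assign(source=f) for f in pundit_files]
)

consensus = (ranks
             .groupby("player")["rank"]
             .mean()                      # average rank across the pundits who listed the player
             .sort_values()
             .rename("Avg")
             .to_frame())
consensus["Rank"] = range(1, len(consensus) + 1)   # re-rank by the averaged value

print(consensus.head(50))
```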

Exploratory Draft Data: Evaluating team visits

Another aspect I wanted to look at was whether a prospect visiting a team during Draft season influenced whether that team eventually selected said player.  To do this, I found Walter Football's team visit list very helpful and tabulated the results for 2013.  I hope to do the same for previous years as well to improve the accuracy.


Here's a breakdown of which positions each team brought in for visits (as reported and collected by Walter Football) or was confirmed to have spoken with at gatherings.  Again, this is just what was reported, so that's likely the reason for the disparities in the numbers; most teams probably bring in about the same number each year.

The columns were conditionally formatted, with dark green values indicating that a team met with that position more than the other teams in the same column did, and the same goes for the grand totals at the end:

There are some pretty clear indicators in there.  Some I can think of: New England and Philadelphia like to do their homework, Atlanta scouted a bunch of DBs (and picked one in the 1st round), Buffalo scouted a bunch of QBs (and, surprise, picked one in the 1st round), and DBs and DL were met with by almost every team.
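
The team-by-position grid is essentially a pivot table.  A sketch of how it could be tabulated, assuming the visit list is scraped into rows of team, player and position (the file and column names are placeholders):

```python
import pandas as pd

# Hypothetical scrape of Walter Football's visit list: one row per reported visit/meeting.
visits = pd.read_csv("visits_2013.csv")   # columns assumed: team, player, position

table = pd.pivot_table(
    visits,
    index="team",
    columns="position",
    values="player",
    aggfunc="count",
    fill_value=0,
    margins=True,          # adds the grand totals row/column
    margins_name="Total",
)

print(table)
# In a spreadsheet, conditional formatting on each column highlights the teams that met
# with a position the most, which is what the dark green cells showed.
```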


Here's whether or not each team's 1st round pick visited (for the record, 61%, or 19 of the 31 known prospects, did):

One guess as to why teams picking toward the end of the round less often brought in the prospects they ended up selecting could be that those teams drafted more from the "best player available" methodology and ended up with people they didn't initially believe they would have the opportunity to draft.


Exploratory Draft Data: Comparing pundits mock drafts

Like I have mentioned previously, the NFL Draft is now a major industry that exists within the massive industry of professional football.  It is a field in which celebrities exist but which is accessible to all: people with a lifetime of experience can give their views next to someone who knows little about the subject, and you can even change your ideas as much as you want as the draft season moves along from February to, now, May.  This has both its advantages and disadvantages in predicting where a prospect will be drafted.


What you ultimately want to do, to use the advantages and reduce the disadvantages, is aggregate mock draft predictions so you reduce the reliance on any one person's judgment.  This new mock ranking should combine both what pick the prospect is predicted to be across the board and what teams the pundits think are good fits, but for right now it is just the draft value points.  This should help quantify the very qualitative process of what a lot of people feel makes sense in terms of fit, which is important as well.


If someone were to expand on this start, I'd suggest they get a much broader array of pundits; I only had time to collect a couple in order to complete this project on time.  There are literally thousands of people willing to give their opinions on where they think prospects should be drafted.


One thing I do want to note is that in order to numerically compare what pick these prospects should be drafted at, I used what is generally accepted as the standard measure of draft pick value, the Jimmy Johnson Draft Value Chart.  It has been used since the 1990s as a way to numerically compare draft pick trades between teams, so in my opinion it describes draft value more accurately than just a slot number.  I decided to use what the NFL ultimately uses, since I want the prediction to be as accurate as possible.  Incorporating a truer draft value chart would make sense if one were available, so until then the old coach of my Miami Hurricanes will continue to be the way the game is defined.
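
Mechanically, the chart is just a lookup from pick number to points.  A minimal sketch of how it plugs in; only the two values cited elsewhere in this post (pick 1 at 3000 points, pick 32 at 590) are filled in, and the rest would need to come from any published copy of the chart.

```python
# Partial Jimmy Johnson chart: pick number -> points.
# Only the two values mentioned in this post are filled in; the remaining
# 1st-round picks would come from a published copy of the chart.
JJ_CHART = {1: 3000, 32: 590}

def draft_value(pick):
    """Convert a (predicted or actual) pick number into JJ draft value points."""
    return JJ_CHART[pick]

# A mock that has a prospect going #1 is "worth" 3000 points for that slot,
# which is what the predicted-vs-actual value plots compare.
print(draft_value(1), draft_value(32))
```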



Comparing mock drafts based on draft pick value:


I collected the final mock predictions of the following draft pundits for any years I could find between 2008 and 2013 within my limited timeframe.

Some had 2 years' worth of data, some had 5.  Obviously, the more years of data, the more accurate the evaluation of the pundit would be, but this is what I could collect.  If expanded, I would collect as many years back as I could from many sources.

To compare their accuracy, I fit a linear regression line and compare their R-square values.  What this essentially tells me is what percentage of the prospects' 1st round draft value you could get right with just each pundit's prediction.  So if Eric Fisher is worth 3000 points as the first pick and Matt Elam is worth 590 points as the last pick, how close could I come using just the pundit's corresponding draft value prediction as the only variable considered?
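
And the grading step itself, sketched with scipy (placeholder numbers, not any real pundit's mock): convert each predicted pick and each actual pick to chart points, fit the line, and keep the R-square.

```python
import numpy as np
from scipy import stats

# Placeholder data: JJ chart points for where a pundit mocked each prospect,
# vs. the points of the pick they were actually selected at.
predicted_value = np.array([3000, 2600, 2200, 1800, 1200,  900,  650,  590])
actual_value    = np.array([3000, 2200, 2600, 1700, 1500,  600, 1000,  590])

fit = stats.linregress(predicted_value, actual_value)
print(f"R-square: {fit.rvalue**2:.2f}")   # the number each pundit is ranked by
```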


Here is one full example, that of the popular Draft godfather himself, Mel Kiper:

This is both a good visual and a numerical representation of the pundit's accuracy.  At the circled blue value, Mel guessed around 2200, but the prospect was really "worth" only about 1500, so Mel overestimated that prospect's draft position that particular year.  The red line is the linear regression fit and, in an ideal world, would run diagonally from the bottom left corner to the top right.  How far off that line sits is a visual indication of how off the accuracy is.  The correlation value on the bottom also tells me numerically how closely an increase in Mel's predicted draft value is associated with the prospect's actual draft value.


Here is the full list of pundits, ordered from most accurate to least, along with how many observations I collected of each:

These aren't perfect comparisons because I couldn't find every pundit's mock for every year I found another's (although if I wanted to compare just 2013, I probably could rank them accurately as of last year).  But it is more for overall accuracy generalizations.  Really what this says is that Todd McShay is better than his contemporary at ESPN, the original draft don Mel Kiper, at predicting a prospect's eventual draft value based on JJ's chart.