Making Mythic in MTG Arena: Estimating How Long It Takes to Reach the Top

Lucas Buccafusca • December 27, 2018

Hello all! I hope everyone is having a fantastic holiday season! With the recent Ranked update to MtG Arena, there was initially a fair amount of backlash over the amount of time it takes for an average (or above-average) player to grind from Bronze to Mythic. The complete breakdown is found on the Wizards (of the Coast) website here.

A brief summary is as follows. A player is in one of six ranks: Bronze, Silver, Gold, Platinum, Diamond, or Mythic. Within each rank there are individual tiers (1-4). Each win provides progress toward a higher tier and thus a higher rank, and losses can drop a player to lower tiers within a rank.

When this was first announced, a Reddit post (which I encourage you to read before continuing) went fairly viral, claiming that a player with a 55% win rate would need about 3 hours of grinding per day for a month to reach Mythic. Since then, there have been several Twitter posts from players who have already reached Mythic (the patch went live about 2 weeks ago).


One improvement here is that I will take rank protection into account (once a player hits Gold, for example, they cannot be dropped back into Silver) in all the simulations. This means our values will be slightly more accurate, and lower, than the theoretical values found in the Reddit post.

Experimentation:

To experimentally test a variety of win rates, I've chosen to apply a Monte Carlo simulation to estimate how long it would take to make Mythic from any given rank or tier. The MATLAB code is available on request (as I can imagine that lazy, uncommented code would not make for the most entertaining article).

To test the validity of this simulation, I compared it against the theoretical values calculated in Razgorths' Reddit post (which does NOT take rank protection into account). After removing the rank-protection code, I obtained results within 0.1 wins of his estimated values. (That data is not included here for ease of reading; feel free to reach out if you wish to see it.) It turns out that ignoring rank protection does make a noticeable dent in the estimated number of games.

For those who do not know what a Monte Carlo simulation is, you can read up on its use in risk analysis here. To vastly oversimplify the idea: you can estimate probabilistic events by simulating a single instance of an event under a particular probability distribution, then repeating that simulation a very large number of times.

For my purposes, I use a probability distribution that represents a player's win rate and simulate how many games it takes to reach Mythic (dividing the simulations up by rank, since the gains/losses change between ranks but not between tiers). I then repeat the simulation (resampling the random variables) a very large number of times: between 20,000 and 2,000,000 iterations per rank, since run time forced me to shorten the iterations for the more complex distributions.
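The MATLAB code itself isn't shown here, but the idea can be sketched in a few lines of Python. Note that the progress values below (2 pips gained per win, 1 lost per loss, 6 pips per tier) are placeholders for illustration only; the real per-rank gains/losses are in the Wizards announcement and vary by rank.

```python
import random

# Placeholder ladder structure -- NOT the real Arena numbers.
PIPS_PER_TIER = 6
TIERS_PER_RANK = 4

def games_to_clear_rank(win_rate, pips_won=2, pips_lost=1, rank_protected=True):
    """Simulate one climb through a single rank; return the number of games played."""
    progress = 0
    goal = PIPS_PER_TIER * TIERS_PER_RANK
    games = 0
    while progress < goal:
        games += 1
        if random.random() < win_rate:
            progress += pips_won
        else:
            progress -= pips_lost
            if rank_protected:
                progress = max(progress, 0)  # cannot drop below the rank's floor
    return games

def monte_carlo(win_rate, iterations=20_000):
    """Average games needed to clear one rank, over many simulated climbs."""
    total = sum(games_to_clear_rank(win_rate) for _ in range(iterations))
    return total / iterations
```

Summing `monte_carlo` over the five ranks (with per-rank win rates and the real pip values) gives the totals reported in the results table below.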

I will be testing the following scenarios:

Fixed Win Percentage: 55%, 60%, 65%, 70% across all tiers and all ranks (2,000,000 iterations per rank)

Scaling Fixed Win Percentages (Each rank has a different win rate): (2,000,000 iterations per rank)
Variable 1: [80%, 75%, 60%, 55%, 55%] (Slightly Above Average Player)
Variable 2: [80%, 75%, 70%, 65%, 60%] (Above Average Player)
Variable 3: [80%, 80%, 70%, 70%, 70%] (Top Tier Pro)
Variable 4: [70%, 70%, 70%, 70%, 60%] (Cabezas Estimate)

Gaussian Distribution (only to show why a Gaussian collapses to a Fixed Win rate under this Monte Carlo scheme): mean μ = 65%, standard deviation σ = 0.05 (200,000 iterations per rank)

Bivariate Distribution (1:1 ratio between μ1 and μ2) (20,000 iterations per rank)
μ1: 55%, μ2: 70%, σ1=σ2=0.02
μ1: 40%, μ2: 80%, σ1=σ2=0.02
μ1: 60%, μ2: 70%, σ1=σ2=0.02

Bivariate Distribution (1:2 ratio between μ1 and μ2) (20,000 iterations per rank)
μ1: 55%, μ2: 70%, σ1=σ2=0.02
μ1: 40%, μ2: 80%, σ1=σ2=0.02
μ1: 60%, μ2: 70%, σ1=σ2=0.02

The Fixed Win Percentage case is the simplest probability distribution, but it provides a solid estimate over a large number of games.

The Scaling Win Percentage is a more accurate model: as one can imagine, any player loses percentage points as they climb (due to the better quality of opponents). I therefore represented four different players as different variables. The first three were arbitrarily chosen as estimates. The 'Top Tier Pro' variable is what I used as an upper bound (yes, one could argue that such a player maintains an 80% win rate all the way until Diamond, but I have to stop somewhere). Lastly, I used the estimate that Jose Manuel Cabezas posted on Twitter to approximate how long it would have taken him to make Mythic.

The Gaussian Distribution is a traditional probability distribution that is often used for a variety of estimation problems. However, for this style of Monte Carlo simulation, the Gaussian collapses to the Fixed Win rate case (Borel's Law of Large Numbers). I included an example to demonstrate this.
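A quick way to see the collapse (sketched here in Python rather than the author's MATLAB): resampling each game's win probability from N(0.65, 0.05) produces, over many games, the same overall win fraction as a fixed 65%, because only the mean survives the averaging.

```python
import random

def empirical_win_rate(sample_p, n_games=200_000):
    """Fraction of games won when each game's win probability is drawn from sample_p."""
    wins = sum(random.random() < sample_p() for _ in range(n_games))
    return wins / n_games

# Fixed 65% vs. a per-game Gaussian draw centered at 65%.
fixed = empirical_win_rate(lambda: 0.65)
gaussian = empirical_win_rate(lambda: random.gauss(0.65, 0.05))
```

Both values land within a fraction of a percent of 0.65, which is why the Gaussian row in the results table is essentially identical to the Fixed 65% row.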

Lastly, I estimated a win rate as a bivariate normal distribution, though even this is a simplification of the truth. In interviews, pros often say that each matchup gives certain odds of winning. If you took all of those odds, combined them with the likelihood of facing any particular deck, and added in the natural variance of the game, you would get a multivariate distribution. That would be an all-encompassing method, but it requires too much data to implement completely. So I again created some simulated distributions, with completely arbitrary parameters (each of the individual variables was Gaussian). Additionally, I ran some simulations in which the higher win rate was weighted twice as heavily (the 1:2 ratio), just as additional data points.
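My reading of this setup (an assumption; the author's exact code isn't shown) is a two-component Gaussian mixture, where the 1:1 or 1:2 ratio acts as the mixture weight between the two means:

```python
import random

def mixture_win_prob(mu1, mu2, sigma=0.02, weight2=1):
    """Draw one game's win probability from a two-component Gaussian mixture.

    weight2=1 gives the 1:1 mix; weight2=2 makes the mu2 component
    twice as likely to be drawn (the 1:2 ratio)."""
    if random.random() < weight2 / (1 + weight2):
        return random.gauss(mu2, sigma)
    return random.gauss(mu1, sigma)
```

This also explains one pattern in the results: the 1:2 mix of 55% and 70% has mean (0.55 + 2 × 0.70) / 3 ≈ 65%, which is consistent with that row landing almost exactly on the Fixed 65% row.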

Results

| Player Win Distribution | Bronze to Silver | Silver to Gold | Gold to Platinum | Platinum to Diamond | Diamond to Mythic | Total Games (rounded) | Time at 6 min/game (underestimate) | Time at 10 min/game (overestimate) |
|---|---|---|---|---|---|---|---|---|
| Fixed: 55% | 14.5457 | 29.6161 | 195.4081 | 235.2665 | 235.1006 | 710 | 71 hours | 118 hours |
| Fixed: 60% | 13.3337 | 24.4586 | 109.9470 | 129.9678 | 130.0055 | 408 | 40.8 hours | 68 hours |
| Fixed: 65% | 12.3132 | 20.8160 | 76.0904 | 89.4592 | 89.4412 | 288 | 28.8 hours | 48 hours |
| Fixed: 70% | 11.4275 | 18.1391 | 58.1302 | 68.1358 | 68.1406 | 220 | 22 hours | 36.7 hours |
| Variable 1 (Slightly Above Average) | 10.0000 | 16.0576 | 110.0382 | 235.2185 | 235.0722 | 606 | 60.6 hours | 101 hours |
| Variable 2 (Above Average) | 10.0014 | 16.0577 | 58.1256 | 89.4488 | 129.9849 | 304 | 30.4 hours | 50.7 hours |
| Variable 3 (Top Tier Pro) | 9.9997 | 14.4096 | 46.9929 | 68.1271 | 68.1271 | 208 | 20.8 hours | 34.7 hours |
| Variable 4 (Jose Manuel Cabezas) | 11.4587 | 18.1478 | 58.1245 | 68.1102 | 129.9998 | 286 | 28.6 hours | 47.7 hours |
| Bivariate (1:1, 55%-70%) | 12.8454 | 22.6487 | 89.8795 | 106.4683 | 106.2415 | 338 | 33.8 hours | 56.3 hours |
| Bivariate (1:1, 40%-80%) | 13.3795 | 24.6514 | 112.1878 | 132.6372 | 132.6163 | 415 | 41.5 hours | 69.2 hours |
| Bivariate (1:1, 60%-70%) | 12.3110 | 20.8309 | 76.3822 | 89.8358 | 89.6139 | 289 | 28.9 hours | 48.2 hours |
| Bivariate (1:2, 55%-70%) | 12.3084 | 20.8107 | 76.7075 | 89.8614 | 89.5547 | 289 | 28.9 hours | 48.2 hours |
| Bivariate (1:2, 40%-80%) | 12.0698 | 20.0778 | 70.4570 | 82.8089 | 82.8177 | 268 | 26.8 hours | 44.7 hours |
| Bivariate (1:2, 60%-70%) | 12.0099 | 19.8919 | 69.1300 | 81.3406 | 81.5545 | 264 | 26.4 hours | 44 hours |
| Gaussian (65%, 0.05) | 12.3120 | 20.8579 | 76.1433 | 89.4591 | 89.4436 | 288 | 28.8 hours | 48 hours |

Note: These are the average results across the trials, which is why the per-rank values are not whole numbers.

To bound the results, I took both a severe underestimate and overestimate of how long a game of Magic takes. In my experience, games take somewhere between 6 and 8 minutes on average (granted, I have been playing Izzet Drakes exclusively on MtG Arena, so my personal data may be biased). This provides an upper and lower bound on the time it would take to make Mythic from Bronze.

Additionally, I've presented the data rank by rank, so anyone starting at a specific rank can see how long it will take to reach the next one. This will prove relevant when I discuss the 'Maintain Mythic' cost later on.

To get a general feel for how the number of games (assuming that the duration of an individual game is comparable) compares to other digital offerings, I’ll be using a Hearthstone Calculator (which also uses a Monte Carlo approach, of sorts) to estimate fixed win rates for 55%, 60%, 65% and 70% respectively.

| Win Rate in Hearthstone | Average Games to Reach Legend |
|---|---|
| Fixed 55% | 498 |
| Fixed 60% | 320 |
| Fixed 65% | 235 |
| Fixed 70% | 185 |

Now, while the number of games is lower across the board for Hearthstone, once you pass the 55% win mark they are similar in magnitude. An extra 50-80 games at 6 minutes per game is approximately 5 to 8 additional hours across a season (a month). A longer grind, yes, but not totally infeasible.

What becomes important is that a strong player (65% win rate or higher) can likely reach Mythic/Legend with about 1-2 hours of gameplay per day for a month. Those who have already made it have noted on Twitter that they played about 4 hours per day. However, the key thing to keep in mind is how restrictive making the top tier is compared to Hearthstone.

In Magic Arena you almost assuredly CANNOT make Mythic with a fixed 55% win rate without an excessive amount of grinding. Whereas in Hearthstone any win rate above 50% will converge to Legend eventually, in Magic Arena you need to be an above-average player to make it. This indicates that the Magic Arena Mythic tier is more exclusive than the Legend tier in Hearthstone; extrapolating from the fact that the Legend tier is occupied by the top 0.5% of Hearthstone players, Mythic presumably represents an even smaller slice of the player base.

Something to keep in mind is that a player's rank/tier should be considered a reflection of individual performance, not necessarily ability or potential. Players who find less time to climb will inevitably end the season at lower ranks, and factors such as card availability, and whether a deck is homemade or copied from an expertly crafted list, will affect a player's success.

Maintain Mythic Cost

I personally don't believe the current ranked system is a problem for this first season (dubbed the Preseason, which runs until January 31st). However, one of my major concerns is the season-to-season transition. The same announcement of the Ranked season included a table showing how your rank decays between seasons; specifically, all Mythic players get dropped to Gold 1 once the new season begins.

As the large table above shows, the bulk of the grind happens going from Platinum to Diamond and Diamond to Mythic (which should be the case, in my opinion). In general (you can use the data above to calculate it explicitly), the return grind from Gold 1 is about 60% of the grind from Bronze to Mythic. This is a sharp contrast to the current Hearthstone model.

Hearthstone implemented a patch updating its ranked system in January 2018 so that a player now drops at most 4 ranks between seasons; for example, any player at Legend is lowered to Rank 4. Using the Hearthstone calculator cited above, it can be shown that the return grind is closer to 30% of the grind from Rank 25 to Legend. This is still non-trivial, but it is far more reasonable and rewards the top 0.5% of players in a healthier manner.

Here is where I see the first major improvement to be made:
MAKE MYTHIC PLAYERS RETURN TO MID-PLATINUM, NOT GOLD 1

Best of 1 vs. Best of 3

I will briefly address the Bo1 vs. Bo3 debate that has been running rampant. While I personally believe that Best of 3 is a better measure of top-tier play (and that Best of 1 warps the format toward rewarding fast decks), implementing it would require a complete restructuring of the Ranked system for this format. It would become almost impossible to make Mythic tier under the current format.

Conclusions:

Overall, the new Ranked system is not as abysmal as initially indicated. With some players having reached Mythic in under 2 weeks, it reinforces the idea that making Mythic is feasible, though not necessarily attainable for players with lower win rates.

Thank you all for reading! I always look forward to any of your feedback on Twitter.