Wednesday, June 28, 2017

[IndieDev] The Nitty Gritty on Save Files, Part 1

Persistence is possibly one of the largest drivers of repeated and extended interaction a game can have. RPGs persist campaign data between sessions; puzzle games persist how far you've progressed and how well you solved each puzzle; even ye olde arcade games persisted high scores for all to see (until someone rebooted the arcade machine, anyhow). With that in mind, creating a robust save system is one of the most important tasks in developing a video game. For us, developing Eon Altar and now SPARK: Resistance, this is no different.

However, even the seemingly simple task of gathering some data, throwing it on disk, and loading it later comes with a bunch of potential issues, caveats, and work. Today I'll talk about our initial attempt at a save system in Eon Altar, why we went that route, why it didn't work, and what the eventual solution came to be.


A Rough Start

When we first created our save system, its primary goal was to save character data and which session the players were on. We had a secondary goal of reusing the same system for our controller reconnect technology, as the character data was originally mirrored on the controllers exactly as it was in the main game, down to the character model and everything. We weren't originally planning on having mid-session saves (those came later), so we really only had to worry about saving between levels.

The "easiest" way of doing this, without having to think of any special logic is to copy/paste the state from the main game into the save file, as well as the controllers over the network. In programmer terms, serialize the state, and deserialize it on the other end. Given our time/budget constraints, we thought this was a pretty good idea. Turned out in practice this had some pretty gnarly problems:

  1. The first issue was simply time. Saving out a file or loading one up took on the order of tens of seconds. Serializing character objects and transferring them across the network was measured in minutes, if it succeeded at all.
  2. The second was coupling to code. Since we were serializing objects directly, any change to the code could break a save file. If we changed how the object hierarchy worked, or if some fields were deleted and others added, existing save files would potentially be broken.
  3. The third issue was complexity. The resulting save file was an illegible, uneditable mess. Debugging a broken or corrupt save file was a near-impossible task, and editing one was also quite difficult, if not impossible. Because of this, we couldn't (easily) write save upgrade code to mitigate issue 2. We'd have been locked into some code structures forever.
  4. The fourth was far too much extraneous data. Because we were performing raw serialization, we were also capturing data about textures, character models, what were supposed to be ephemeral objects, hierarchy maintenance objects, and so on.
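To illustrate, the naive approach looks roughly like the sketch below. This is a hypothetical example using C# and the stock BinaryFormatter, not our actual code: you point the serializer at a live object graph and it walks everything reachable, which is exactly how textures, ephemeral objects, and hierarchy bookkeeping end up on disk alongside the data you actually care about.

```csharp
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static class NaiveSave
{
    // Hypothetical sketch of the "copy/paste the state" approach.
    // Whatever 'character' references -- models, ephemeral runtime objects,
    // hierarchy bookkeeping -- gets dragged into the file along with it.
    public static void Save(object character, string path)
    {
        var formatter = new BinaryFormatter();
        using (var stream = File.Create(path))
        {
            formatter.Serialize(stream, character);
        }
    }

    public static object Load(string path)
    {
        var formatter = new BinaryFormatter();
        using (var stream = File.OpenRead(path))
        {
            return formatter.Deserialize(stream);
        }
    }
}
```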
While we had a save system that, on the tin, did what we wanted, it was untenable. Shipping it would've relegated our small engineering department to an immense amount of time fixing or working around those issues. So while this approach was "simple" and "cheap" in terms of up-front engineering cost, it was the wrong solution. We went back to the drawing board.


The Reimagining

About a year after we started development, the team shrank pretty substantially. We'd lost a third of our engineering team, and my time became even more contested as I became the new Lead Programmer. I had to contend with the responsibilities that came with that title while continuing to deliver features and fixes.


However, I had already been noodling on the save and reconnect systems, and had a new plan. The first step was to fix the controller reconnect, which you can read more about here.

Given that reconnect was taking 8 minutes each time we had to reconnect a controller, it didn't take much to convince upper management that something needed to be done. And since reconnect and save were intimately connected at the time, making the argument to fix save shortly after wasn't a hard sell either. So even though I had to disappear for 2 weeks to fix reconnect, and then another 2 weeks later to fix save files, I think everyone involved believes it was the correct decision.

To fix our 4 issues, it wasn't sufficient to just be able to save and load character data in any which way. It needed to be quick, it needed to be decoupled from the code, it needed to be easy to read/edit/maintain, and it needed to be deliberate about what it saved out.



The Reimplementing

For humans, text is easier to read than binary, and a semantic hierarchy is more legible than raw object data. So I knew pretty early on that my save file data was going to be XML, in plaintext.

Plaintext was important. We often get asked why we don't encrypt our save files, and it comes down to maintenance. Human-legible files are easier to read and easier to fix. As an indie studio with extremely limited resources, this was a higher priority for us than preventing people from cheating via their save files in a local multiplayer game. If your friend gives themselves infinite resources and you catch them, you can dump your Coke in their lap.


Plaintext has saved our bacon multiple times: when a bug is blocking our playerbase, more enterprising players have been able to repair their own save files with careful instructions from us (and a lot of WARNING caveats) until we can get around to fixing it. It also lets us quickly pull information out of broken save files without having to decrypt anything first.

The benefit of using XML is that we could serialize to it and deserialize from it programmatically without any extra work on our part: tools to do so already existed. In fact, we were already using those tools for the old save files. The difference was that instead of serializing the character object instances directly, I created an intermediate set of data classes, decoupled from the objects that made up the character data in-game, and organized according to gameplay semantics rather than the raw object hierarchy.

An example of a simple data class, and the resultant XML.
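To give a flavour of the idea, here's a minimal hypothetical sketch (not the actual Eon Altar classes), assuming C# and the built-in XmlSerializer, along with the rough shape of the XML it produces:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

// Hypothetical save data class: nothing but value types, other data classes,
// and generic Lists, so the stock XmlSerializer can handle it directly.
public class CharacterSaveData
{
    public string CharacterId;
    public int Level;
    public int CurrentHealth;
    public List<string> AbilityIds = new List<string>();
}

public static class SaveExample
{
    public static string ToXml(CharacterSaveData data)
    {
        var serializer = new XmlSerializer(typeof(CharacterSaveData));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, data);
            return writer.ToString();
        }
    }
}

// The resulting XML looks roughly like:
// <CharacterSaveData>
//   <CharacterId>ExampleHero</CharacterId>
//   <Level>4</Level>
//   <CurrentHealth>37</CurrentHealth>
//   <AbilityIds>
//     <string>ExampleAbilityA</string>
//     <string>ExampleAbilityB</string>
//   </AbilityIds>
// </CharacterSaveData>
```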
Having actual data classes meant we could lean on the compiler to ensure data types matched up, and we could just use existing serialization tools to spit out the save data. It did mean a fair bit of manual work to determine what goes into the save file and where, but the benefits more than made up for the upfront time. Adding new fields to save data is trivial, and populating new fields via upgrade code isn't terribly difficult. Editing existing save files became simple because the format was now extremely legible--legible enough that we've had users edit their own save files. And good news: because the data was decoupled, we could actually write save upgrade code!

Collating the data into the data classes at runtime is a super speedy process: less than 1ms on even the slowest machines. We're only serializing the simplest of objects--data classes are generally made up of nothing but value types, other data classes, or generic Lists of those. And since we were no longer serializing a ton of extraneous objects that were only supposed to exist at runtime, the amount of data we'd save out was significantly reduced: 29KB for a file with 2 characters, instead of multiple MBs. We put the actual writing of the save file to disk on a background thread; once we had the data collated, there was no reason to stall the main thread any longer, and disk writes are notoriously slow.
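A minimal sketch of that pattern, reusing the hypothetical CharacterSaveData from above and assuming a .NET runtime where Task.Run is available (an older Unity/.NET 3.5 setup would use a plain Thread instead):

```csharp
using System.IO;
using System.Threading.Tasks;
using System.Xml.Serialization;

public static class SaveWriter
{
    // Collating into CharacterSaveData happens on the main thread (fast);
    // only the serialization and the slow disk write run in the background.
    public static Task WriteAsync(CharacterSaveData data, string path)
    {
        return Task.Run(() =>
        {
            var serializer = new XmlSerializer(typeof(CharacterSaveData));
            using (var stream = File.Create(path))
            {
                serializer.Serialize(stream, data);
            }
        });
    }
}
```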

The difficult part was going from the data classes back to instanced data. Previously the runtime state would get hydrated automatically, because that's what deserializing does. Now, since deserialization only hydrated the data classes, I had to write a bunch of code that recreated the instanced runtime character data from them. This required a lot of combing over how we normally generate these object instances, and basically "editing" a base character by programmatically adding abilities, inventory, etc. based on the save data. It wasn't particularly hard, but it was time consuming, and it was where most of our bugs were likely to lie. But using the same methods we call when adding these things normally at runtime allowed me to reuse a lot of existing code.
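In sketch form, rehydration looks something like the following. The Character type and its methods here are simplified, hypothetical stand-ins for the real game classes; the point is that the save code calls the same character-building methods the rest of the game already uses:

```csharp
using System.Collections.Generic;

// Hypothetical, heavily simplified stand-in for the real runtime character type.
public class Character
{
    public string Id;
    public int Level;
    public int Health;
    public List<string> Abilities = new List<string>();

    public void SetLevel(int level) { Level = level; }
    public void SetHealth(int health) { Health = health; }
    public void AddAbility(string abilityId) { Abilities.Add(abilityId); }
}

public static class SaveLoader
{
    // Rebuild a runtime character from its save data by "editing" a base
    // character with the same methods used during normal gameplay.
    public static Character Rehydrate(CharacterSaveData data)
    {
        var character = new Character { Id = data.CharacterId };
        character.SetLevel(data.Level);
        character.SetHealth(data.CurrentHealth);

        foreach (string abilityId in data.AbilityIds)
        {
            character.AddAbility(abilityId);
        }

        return character;
    }
}
```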


Part 2: Checkpoint Saves
 
We had our new save system, and it was pretty awesome. The original save system took approximately a week, if memory serves, maybe a little longer. The new system took a month to implement, counting research, programming, and testing. Basically, you get what you invest in. Skimping on engineering time for this feature was, in 20/20 hindsight, a bad decision, but we fixed it, so all is well today!


Next blog post I'll discuss the next step we took for Eon Altar: checkpoint saves. Why checkpoints? What did we need to do to retrofit the game to handle them? How did we implement them? What pitfalls did we run into? And finally, what can we reuse for SPARK: Resistance? #IndieDev, #EonAltar

Monday, May 8, 2017

[Alliance:HotS] Stats and Stat Relationships

Alliance: Heroes of the Spire--like many RPG-derived systems--has a number of statistics on each hero, and they're not really explained in-game. I've had the luck to chat with the developers (they're quite available, which is super cool), and have gotten a few formulas out of them instead of having to reverse-engineer everything on my own, which is fantastic. So here I'll talk about a couple of the major formulas, then talk about what those formulas mean for numeric relationships.

WARNING: SO MUCH MATH AHEAD


Aim vs. Block

Possibly the most asked about, and one of the more misunderstood. Aim and Block affect how often your debuffs land, or how often debuffs land on you. They're directly opposed. The formula works out to:

ProcChance = BaseRate * (1 + Aim - Block)

Note that Aim and Block are both percentages here, so divide the values you see in the UI by 100 before plugging them in.

So an example might be Caelia, who has a 50% base rate to proc a Heal Block debuff on her target with her A1. If she has 73% Aim and the target has 25% Block, the result is:

50% * (1 + 0.73 - 0.25) = 50% * 1.48 = 74%
So the two are linearly opposed, but multiplicative to the base proc rate. If the base rate is low, it'll still likely be low even with oodles of Aim; a base 20% would only be 40% with 100 Aim against 0 Block, which is twice as high but nowhere near a guaranteed proc. But that Aim will prevent a high-Block enemy from dumping your proc rate into the toilet, as 100 Block against 0 Aim means multiplying your proc rate by 0.

Basically, if Aim and Block are close, it'll be about your base proc rate. The further apart they are, the greater the effect, but the base proc rate is still the biggest factor.


Power vs. Armor

Okay, so let's do some damage. The thing to remember about damage rolls is that the only "roll" that occurs is the crit roll. Aim has nothing to do with your "accuracy" in the traditional sense (it only affects debuffs), and the damage itself is a static number based on your stats; there's no random variation.

The damage formula is just a string of multipliers:

Damage = RawDamage * ArmorFactor * CritFactor

Each individual factor is a little more complex, but not by much.

Raw damage is simply your power multiplied by the scale factor of the ability. In cases where the ability does bonus damage, that bonus damage is generally the stat multiplied by a different scale factor. You can find scale factors on https://spirebarracks-dev.herokuapp.com/ for each hero ability, though I'm unsure how up to date it is.

For example, Akamin's A1, Magic Bolt, has a scale factor of 1, so its base damage is simply your Power. His Spray of Flame, however, has a scale factor of 1.15, so it does more base damage.

In the case of "bonus damage" such as Otto's A1, Backhand, you have 0.2 * Power + 0.44 * Armor as the RawDamage factor.


The Armor Factor is based on your opponent's armor. All attacks are affected by this factor--unless they penetrate armor, but I'm not covering that today. Working backwards from the numbers below, the relationship is:

ArmorFactor = 1040 / (Armor + 1040), which means Mitigation = Armor / (Armor + 1040)

This ensures what is known as diminishing returns: after a certain point, each extra 1% of mitigation becomes more expensive than the last.

Armor Value vs. Percentage Mitigation

For example, to get 20% mitigation, you need 260 armor. For 60% mitigation, you need 1560 armor. For 80%, 4160 armor. For 90%, 9360 armor.

That sounds pretty excessive, but each interval I chose was half the damage taken of the previous interval. However, armor does start to lose a lot of luster after 4000ish unless you can easily net those armor points.
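Assuming that Armor / (Armor + 1040) relationship holds, you can invert it to get the armor required for a target mitigation, which is where the numbers above come from:

Armor = 1040 * Mitigation / (1 - Mitigation)

e.g. for 60% mitigation: 1040 * 0.6 / 0.4 = 1560 armor.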

Finally, CritFactor:

CritFactor = 1 if you don't crit, or 1 + CritMult% if you do

Pretty simple: it doesn't affect the calc if you don't crit, and it increases your damage by your Crit Multiplier if you do.
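Putting the pieces above together, here's a worked example with made-up numbers: a 1000 Power hero using a 1.15-scale ability against a target with 1040 armor (50% mitigation), landing a crit with a 50% Crit Multiplier:

Damage = (1000 * 1.15) * (1040 / (1040 + 1040)) * (1 + 0.5) = 1150 * 0.5 * 1.5 = 862.5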


Crit% vs. CritMult%

Critical Strike Rate increases your damage, as does the Critical Multiplier. However, the two are symbiotic: the more Crit% you have, the more you benefit from CritMult%, and vice versa. The good thing is that since this relationship is static, we can math out the optimal numbers for best performance.
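Assuming the CritFactor above, the average damage multiplier over many attacks works out to:

AverageDamageMultiplier = (1 - Crit%) * 1 + Crit% * (1 + CritMult%) = 1 + Crit% * CritMult%

which is the surface the graphs below plot.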

The graph comparing the two looks like the following:
Crit% on bottom left X axis, CritMult% on right, Z axis. Ultimate average damage multiplier on the Y axis.
That's a little hard to read, so here's a contour graph instead:


X axis is Crit%; Y axis is CritMult%. Contours from left to right are +0.2 damage multiplier
The darkest blue section represents an average damage increase of 0% - 20% over time. Then we have 20% - 40% in the mid-blue, 40% - 60% damage increase average in the blue-orange, and so on.

From this graph we can easily see that Crit% has the bigger effect on our average overall damage until we start getting close to maximum Crit%. In the 60% to 80% Crit% range, we may actually be better off starting in on CritMult% (assuming you're going for damage, and not, say, Witchstone, which cares naught about CritMult%).

A level 25 Weapon gives 35% Crit% or 67% CritMult%, whereas jewels are 5% each, making Crit% jewels far more powerful than CritMult% jewels up to a certain point. The Crit% weapon is still more powerful than the CritMult% one unless you're rocking enough Crit% jewels to hit ~60% Crit without the weapon (which is nine 5% Crit% jewels; attainable, but good luck).


Crit%/CritMult% vs. Power

This is going to be the most complex relationship, and depends entirely on what scale factors your abilities have. However, if we assume a scale factor of 1x your power, life gets a little easier. Then the amount of extra damage you do depends entirely on your percent increase to Power.

If we look at the contour plot in the section above, with the base level of CritMult, we'd need 80% Crit% to maintain a +40% damage for an A1 with a scale factor of 1, whereas a level 20 Power weapon will give you +40% damage by itself. This benefit gets even better for abilities that have better than 1x scaling for Power. Basically, if you go all in on Power, it should be numerically comparable to going all in on Crit% and CritMult%, if not better.

Nine 5* Power jewels is +99% Power, and Weapon and Gloves at 5* would be another 102%, meaning you'd do triple base damage. Versus nine 5* Crit% jewels for +45% Crit% (60% total) and +134% CritMult% (184% total), which only sits around double average damage.
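Spelling that out with the average damage multiplier from above (and taking those gear numbers at face value):

Power route: 1 + 0.99 + 1.02 = 3.01x base damage on every hit
Crit route: 1 + 0.60 * 1.84 = 2.10x average damage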

Crit% jewels seem to be far more abundant than Power% jewels, though. I'm swimming in 5* Crit% jewels and have...0 Power% 5* jewels. I'm not sure they even exist. 4* jewels give 8%, which is +72% Power--still better on average than the Crit% route.

But barring Supercrits or mechanics that play off Crit% or Crits, Power seems to be the mathematically superior option here, especially at lower ends of gear--at least for average damage over time. For PvP, especially with Magitek Bards that can reset the health bars of their party every 3 turns, burst is the name of the game, and Crit%/CritMult% will give you far better burst than Power alone will.

Basically, Crit%/CritMult% makes your DPS swingier with a higher upper bound at low gear levels, whereas Power gives solid, dependable DPS that isn't as swingy. But with enough Power gems, even Power can reach the upper bounds of what Crit%/CritMult% can manage.

And again, this goes entirely out the door if your abilities scale poorly off Power--so most tanks, or multiattack abilities like the Pistoleers' or Free Blades' A1. Or if you have things that proc off Crits, or you're using Supercrit (which I'm not going to do the math on today).


HP vs. Armor

Often people consider something called "Effective Health", which is a combination of factors that basically say: you have effectively this much health. For example, if you have 1000 HP, and 50% mitigation, your "effective health" is 2000.

Think of it this way: if an attack does 500 damage a shot and you have 1000 HP with 50% mitigation, each shot actually only does 250 damage, so it takes 4 shots to kill you. Alternatively, if you have 2000 HP and no mitigation, it also takes 4 shots at 500 damage each to kill you. Hence an effective health of 2000.
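In formula form, with mitigation expressed as a fraction:

EffectiveHealth = HP / (1 - Mitigation)

e.g. 1000 / (1 - 0.5) = 2000.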

Armor and HP tend to be diametrically opposed on gear: you either have armor, or HP. And HP generally comes in percentages (I'm ignoring raw-value jewels for this), so you can directly compare how much effective health your armor gives you versus how much your HP gear gives you. But note that Effective Health scales off both HP and armor, so it's not a strictly 1:1 relationship.

Add to that the fact that the majority of healing is done via percentage heals, and there's literally no reason to care about raw HP; effective health is king here. What complicates this is that armor isn't a linear value. Diminishing returns make the relationship a lot harder to determine, and the hero's base armor plays a huge role.


+HP% on bottom left X axis, Mitigation% on right, Z axis. Effective Health multiplier on the Y axis.

+HP% on the X Axis, Mitigation% on the Y Axis, every contour is +1x Effective Health, starting at +2x
As you can see in the plots, mitigation increases effective health dramatically as it approaches 80%, and 90% even more so. I actually had to cut the graphs off at 80% or they would be barely legible: 90% mitigation is basically 10x effective health, for example, vs. only 5x at 80% mitigation.

Which is to say, armor has a much larger effect on effective health than raw HP does. And remember that 1560 total armor is enough for 60% mitigation. But since what you actually stack is armor, not mitigation, let's sub the Armor formula in for y in our contour graph:
+HP% on the X Axis, Armor on the Y Axis, every contour is +2x Effective Health
There's no easy off the cuff answer here, unfortunately. Some combination of health and armor is likely to be the best. Here's a closeup of the bottom half of the graph with a higher contour fidelity:
+HP% on the X Axis, Armor on the Y Axis, every contour is +1x Effective Health, starting at +2x
So ~2100 armor but no health is about x3 Effective Health, which is about the same as +200% HP and 0 armor. But if you could manage ~2100 Armor and +50% HP, you're looking at nearly x4.5 Effective Health.

No easy answers, but unless Rumble decides to put in attacks that do static HP in damage instead of percentages, or starts converting heals from percentages to static amounts, your actual HP doesn't matter. It's all about the Effective Health. The one exception currently is Armor Penetration, which makes stacking armor penetration potentially extremely powerful against Tanks, but I haven't run the numbers yet; that's just a hunch.

Edit: There is one other thing: Armor Break. Normal is 50% armor reduction, Witchstone is 75%. For a tank with the 4160 armor for 80% mitigation, that means 2080/1040 Armor after the debuff, which amounts to 66%/50% mitigation. So basically, 70%/150% more damage taken. So HP is a buffer in case of Armor Break.


Conclusion

I don't know how speed works precisely, so that's the one stat I'm missing, but otherwise this is a pretty comprehensive mathematical look at the stats in Alliance. Power in general seems to be undervalued by the community and Crit% overvalued. Armor vs. HP has a correct optimal answer, but depends on how much armor your character can get. Crit% vs. CritMult% also has a correct optimal answer, and Aim vs. Block is pretty straightforward.

The wrinkles that get thrown into these are basically individual ability power scalars, "bonus damage" scalars (like armor for some tanks, or Aim/HP/whatever), and abilities that proc off crits, including armor sets such as Witchstone and Wartech. A lot still depends on the individual hero. And none of this takes buffs/debuffs into account.
#Theorycraft, #AllianceHotS

Sunday, April 23, 2017

[Alliance:HotS] Theorycrafting Critical DoTs

I've been playing a lot of Alliance: Heroes of the Spire the past couple of months. It's a nifty Summoners War clone made by a North American company, which has been pretty cool because we get to talk to the devs relatively often--as a dev myself, I'm pretty appreciative of that access.

So, you summon heroes, level them up, gear them up with up to 6 pieces of gear, and let them loose on other teams and dungeons. The thing about gear is that wearing a certain number of pieces from a set grants a bonus. For example, Bone gear will increase your hero's Health by an extra 20% for every 2 pieces of Bone gear you have on. Some more powerful sets require 4 pieces, making them mutually exclusive with other 4-piece sets--such as Swiftsteel (25% chance of performing an extra ability 1 attack every time you attack), or Titanguard (transfer 30% of damage done to your party to you, and reduce incoming damage by 15%).

My Sunslash's gear screen, wearing 4-piece Witchstone and 2-piece Sharpthorn
However, one of the more interesting mechanics I've found in the game is that using a specific item set--Witchstone--your buffs/debuffs can critical strike, making them either 40% ~ 66%ish more powerful depending on the buff/debuff, or undispellable if it's not a numeric buff.

A prime example of this is Damage over Time debuffs, aka DoTs. A regular DoT always does 5% of the target's max health in damage each round, but a critical DoT does 7% per round (an increase of 40% total damage). DoTs are great at shredding most PvE bosses, as bosses tend to be big targets with lots of Armor (damage reduction), and DoTs ignore Armor.

Razormane, Flameclaw, and Icefang are plentiful and, against the right targets, powerful.
As such, lots of people like to use Razormane or Flameclaw to take out bosses. The benefit of these cats is that they're quite common--they're available from the worst pull you can make, and they even drop from some dungeons--and their main attack has a 30% chance to apply a DoT (plus a second DoT if they were stealthed when they attacked).

Where this becomes really interesting is Swiftsteel (25% chance of an extra attack) vs. Witchstone (If your DoT crits, it'll be 7% DoT). Are the extra attacks better than the critical DoT? Let's do some math.

Making this all slightly more complex is the fact that debuffs may not always land. The to-hit of a debuff is Base Chance + (Aim - Enemy Block). Pretty close to everything has a base 15% to block (Bosses actually hit 25% at the highest floor you can fight them at).

So we have two possible stats for our DoT to scale from: Aim (Hit%), and Crit%. To make this easier, we'll ignore stealth for Flameclaw, and just assume spamming the first ability over and over.


Varying over Crit%

Let's assume we have a 100% chance to apply a DoT (125% Aim against a Floor 6 boss), and vary the critical strike percentage.

For Witchstone, we have a Crit% (call it x) chance to apply a 7% DoT; otherwise it's a 5% DoT:

ExpectedDoT = x * 7% + (1 - x) * 5% = 5% + 2% * x

For Swiftsteel, the DoT never crits, so the amount of damage we can apply is based on the Swiftsteel proc rate, an average of 1.25 attacks per turn:

ExpectedDoT = 1.25 * 5% = 6.25%

Now, there's a small flaw in this math that I'm going to gloss over, which is that these expected health-percentage damage values are an average over a lot of samples. In a single fight where you might get 5 - 10 turns, the variation is going to be much higher, so take this with a grain of salt. But over a lot of fights, we can work without dealing with that flaw.

In any case, since Witchstone's damage varies with Crit% and Swiftsteel's doesn't, there should be a solid inflection point where Witchstone starts to outperform Swiftsteel for DoT damage:

5% + 2% * x = 6.25%  =>  x = 0.625

So, assuming we'll always apply a DoT, Witchstone will usually outperform Swiftsteel once you reach a 62.5% chance to critically strike your DoT.


Varying over Hit%

Let's assume we have a 100% chance to critical strike, and vary over Hit%, aka y. I'll ignore the Block/Aim/Base Chance portion, and work with the Hit% directly to make life a little easier.

For Witchstone, this is simple, since it always crits:

ExpectedDoT = 7% * y

For Swiftsteel, we scale both the regular attack and the 25% proc attack by Hit%:

ExpectedDoT = 1.25 * 5% * y = 6.25% * y

Comparing the two, we find that Witchstone simply scales faster than Swiftsteel with Hit%. The only value of y where an inflection point can exist is y = 0:

7% * y = 6.25% * y  =>  0.75% * y = 0  =>  y = 0

We'll see this fact crop up again when we try to vary over both Crit% and Hit%.


Varying over both Crit% and Hit%

Remember, x is Crit%, y is Hit%

Witchstone:

ExpectedDoT = y * (5% + 2% * x)

Swiftsteel:

ExpectedDoT = y * 6.25%


Equating the two:

y * (5% + 2% * x) = y * 6.25%

Almost immediately we notice we can divide the entire equation by y, removing that variable entirely. Basically, Hit% is meaningless to how the two sets scale relative to each other, which means 62.5% Crit% is still the magic number where they become equivalent for this really basic scenario.


How Reality Completely Breaks My Model

Of course, Swiftsteel is more interesting than I've allowed for in my modeling. It can actually proc off any attack, meaning that if you use Prowl, you could end up making an extra attack, whereas the Witchstone build might not get anything except an undispellable Stealth buff. On the other hand, if Swiftsteel procs off a move that has no target, I believe it picks a target at random, so it might be a wash depending on who it hits (unless you only have one target).

This actually makes Swiftsteel significantly more valuable than that 62.5% Crit% inflection point would have you believe. A fully skilled-up cat on auto-battle will only use its A1 ability every other round based on cooldown rotation, which, if you squint, makes Swiftsteel behave more like a 50%ish proc rate than a 25% one--if we count each A1 usage as a double chance to proc, which is a small fallacy but close enough for demonstration purposes. That would push the inflection point to around 130% Crit%, which is absurd, as anything above 100% is wasted (also, good luck hitting that much Crit%). Nor does this account for the extra initial damage each Swiftsteel attack grants, though in the case of the cats that's usually small enough to be negligible.

But Witchstone has other benefits. For example, Sunslash, the Order cat, has an A3 that Marks all targets for 3 rounds, increasing the damage anybody does to those targets by 30%--or 50% on a critical strike if you have Witchstone. That's a significant chunk of extra damage against potentially the entire enemy party, which makes Witchstone a better bet for Sunslash's overall DPS (assuming you can stick those debuffs). He'll likely have fewer DoTs, but critical DoTs will help make up some of that difference.

However, the 62.5% Crit% inflection is something to remember if we run into other DoT classes. Enough Crit%, and Witchstone will outstrip Swiftsteel's performance. But at the end of the day, it also comes down to what other abilities your hero is rocking, and what you need that hero for.

But if you're just using Flameclaw/Razormane for Boss Shredding DoT application, Swiftsteel is the way to go. #Theorycraft, #AllianceHotS

Wednesday, April 12, 2017

[WoW] Invading on a Schedule

7.2 (re)introduced the concept of Legion Invasions, and all in all, the one I did was enjoyable enough. A smattering of world quests in phased zones that give the impression the Legion is legitimately invading was pretty cool, and the wrap-up scenario felt somewhat like the scenarios back in Mists of Pandaria, which is a huge plus for me.

However, said Legion Invasions appear once per day, at a pseudo-random time, and only persist for 6 hours, which is a bit vexing for those of us who have a job and need to sleep. Ironically, I am too busy making video games to play them much these days. I get to play WoW about twice a week, and half of that is running a raid, so hoping that a Legion Invasion is up on a day I get to play but not raid is frustrating.

You will not be fighting the Legion today.

Which leads me to the question of scheduled gaming versus random limited-time events. Ostensibly, to make the world feel like a world, things should happen with or without your intervention. MMOs have been doing this to a basic degree since close to the dawn of time with rare spawns, holiday events, and launch events. Time-limited things that you have to be in the right place at the right time to experience.

World of Warcraft over the past few expansions hasn't really gone beyond the above-mentioned "scheduling", allowing players to largely set their own schedules in game. You can raid when you want to, run dungeons when you want to, run dailies when you want to, and so on. Sure, there's weekly or daily lockouts on a lot of content, but Blizzard doesn't dictate when during the day (or week) you must perform these activities.


Time-Limited Content

Last expansion, Blizzard introduced Timewalking dungeons. Unlike FFXIV's implementation, Blizzard originally only allowed you to do Timewalking over a single weekend for a given expansion. Folks complained--rightly so, too. Not everyone's "weekend" falls on Saturday/Sunday. Now Blizzard makes these events a whole week long, splitting the difference between gaming on Blizzard's schedule versus gaming on your own.

In 7.2, we have two major pieces of time-restricted content: Legion Invasions and Broken Shore buildings. The buildings are up for 3 days, then get blown to bits until the community rebuilds them (in the same place, no less, because we are not very bright defenders, apparently). 3 days is still a bit short for some schedules, but you're likely to overlap with at least part of that window. Compare that to the 6 random contiguous hours every 24; in two weeks, that window has overlapped with my playtime no more than once.

Now, the issue around invasions being "required" has largely been mitigated, thanks to the removal of that content from the requirements for flight. It's really just occasional optional content--extremely lucrative content, mind you--that you may hit or miss. Which isn't unlike many mobile games these days, except a mobile game I can load up in less than 30 seconds, tap a few things, and deal with the limited-time content. WoW takes me over 5 minutes to load (including getting from the Order Hall to Dalaran, assuming the game doesn't stall on the loading screen and force me to restart the process all over again) before I can actually start doing anything, and I have to be at home, on my computer. It takes significantly more effort to go after time-limited content in WoW than in a mobile game.

On the other hand, if they allowed the Legion invasions to stretch longer (say, 12 hours instead of 6), they'd probably need to make them less lucrative, as more people would be able to do them. Making them time-limited is a method of gating those resources for the vast majority of the population, and one that feels more natural than just saying, "You've already done this today, you are locked out."


Preferred Playstyle

I'll fully admit this is probably just me and my preferences, but when I play WoW, I expect largely to play on my own schedule, and content that's billed as something you're expected to do semi-regularly but that is unavailable on my schedule rankles. It makes me feel like, despite putting in the same time I was before the patch, I'm falling behind because my game time doesn't line up with random events. There's a fuzzy line somewhere for the "acceptable time limit to do this thing you're expected to do by the designers"; it feels like it should be greater than 6 hours, but it's almost certainly fine in the 1-week range.

But to be fair to me, WoW over the past 12.5 years has largely allowed us to play on our own schedules for the vast majority of content, so there is an expectation there built into the game that the WoW developers are intentionally breaking. We can't expect them to never try anything new--that would be far too stifling--but if they're going to break convention, they should ensure they're doing so for good reason. For me as a player, as far as I can see as an outsider, it doesn't feel like a good enough reason and made me annoyed (and again, that annoyance has been largely mitigated by the removal of the requirement to do that content).

I feel the buildings are in a strange place, because 3 days feels borderline too short, but at least I can say, "Oh, the building is up, I can adjust my schedule to play an hour in 2 days," versus Legion Invasions, where it might be another 2 weeks before I can do one just because they don't line up with my play time. Versus other World Quests, which are, as Ornyx put it, "largely interchangeable". Missing a couple of World Quests is no big deal, because there are always like 30 more available at any given moment.

So overall, while I like the content well enough, I'm really not a fan of the current time gating on Legion Invasions. But I don't like the content enough to drop other things in my life and screw up my IRL schedule just to do a Legion Invasion while it's up.
#WoW, #GameDesign