
3e/3.5: Historical curiosity concerning the math.

Started by beejazz, January 24, 2013, 01:39:09 PM


beejazz

I'm not sure if this ought to go in General or Design and Development. Tangent from the Monte Cook interview thread:

Quote from: JeremyR;621102The thing that got me eventually was the higher level stuff. I'm not sure that got tested much; everything just bogged down because of so much inflation in the numbers.

Quote from: beejazz;621113High levels were just absurd. I don't know what they were thinking with some of their decisions on how things scale with level. But I don't wonder why 4e fans fetishize the particular math of 4e, especially if they started with 3x.

Quote from: Bloody Stupid Johnson;621156Start a new thread if you like... A lot of it is simply converted from 2E of course, with side effects from the greater codification of various things (like monsters now having ability scores). Save progressions are built with good/poor saves corresponding to the progression of primary/secondary spell levels (a Good save is +1 per 2 levels because a wizard gets spells a level higher every 2 levels, putting the DC up 1) according to Dragon magazine - they just didn't factor in resistance bonuses, level-based stat increases, or multiclassing. Lack of playtesting at higher levels isn't too surprising if you consider how long it took to get to name level in older D&D, either.

So color me curious: Where did some of these decisions come from? Were some of them legacy elements that fared poorly in their new context? Were others new mechanics that weren't fully sanity-tested? Where should I go if I'm looking for accounts of why the designers did this and that?

I've already got something of a clear idea of what I like and dislike mechanically in my edition of choice (3e), but I've got some historical curiosity on what prompted these decisions and how some of them slipped through.

The stand-out examples would include the widening gap in saving throws and attack bonuses, why iterative attacks work the way they do, that sort of thing.

Garnfellow

This is a good question. No edition of D&D has ever done high-level play all that well right out of the box, not even 4e's vaunted "fixed math," which was supposed to scale all the way up to level 30.

I think the simple, general answer is that when you're designing a new system, you spend a lot of effort just getting the basic mechanics right. Why bother focusing on 9th level spells if you're still grappling with fundamentals like, "how should saving throws work"? Consequently, a lot of effort goes into fixing the lower and mid levels, and testing high level play inevitably becomes an afterthought.

Also, at lower levels, there are a lot fewer potential combinations of abilities, powers, magic items, and spells, whereas at higher levels all sorts of strange things can happen, and it's hard to anticipate these interactions without doing a shit-ton of playtesting. Like, years.

For all editions prior to 3e, the power curve basically plateaus at around name level (say 8th-9th level). Wizards and clerics continue to get new and higher level spells, but almost all other aspects either stop or significantly slow down. There's a lot of interesting talk in the OSR about D&D's "end game," and what it was supposed to look like. An influential theory is that, once PCs hit name level, the game was intended to shift from hexcrawling and dungeon delving to castle building, arcane research, and so on. Hence the plateau.
 

Garnfellow

Specific to 3e: In 3e, they dropped the idea of a separate end-game and basically re-engineered the math so that the power curve from levels 1-10 extends smoothly to levels 11-20. So lower-level 3e plays a lot like lower level 1st or 2nd edition, but high level play looks pretty different.

Extending the power curve has a few unexpected side effects. The spread between capabilities (AC, Base Attack Bonus, Saves, DCs, skills) becomes so divergent at higher levels that specialized characters tend to always succeed at their core competencies and non-specialized characters tend to always fail. This is equally boring for both sides. The fighter will never try to make a Listen check, because her +1 bonus is worthless. The rogue doesn't really need to roll, because his +20 bonus is so good.

This disparity creates all sorts of unexpected secondary problems. For example, this is what fueled the "save or die" hatred. To my mind, the problem isn't that there are lethal abilities in the game that can just fucking kill you, period. I happen to think that's awesome. The problem is that at the higher levels, these spells tend to be anticlimactic. "The fighter needs to make a Will save? Fuuuuuuuuuuu."
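To put rough numbers on that gap (my own illustration; the DC and bonuses are just picked to look like typical high-level play):

# Rough illustration of the specialist/non-specialist gap on a d20 check.
# DC 25 is an arbitrary "high level" difficulty; natural-1/natural-20 rules ignored for simplicity.

def success_chance(bonus, dc):
    """Probability that 1d20 + bonus meets or beats the DC."""
    return sum(1 for roll in range(1, 21) if roll + bonus >= dc) / 20

print(success_chance(1, 25))   # fighter with +1  -> 0.0 (literally cannot succeed)
print(success_chance(20, 25))  # rogue with +20   -> 0.8 (fails only on a 1-4)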
 

Garnfellow

Other issues with high level play in 3e include cascading modifiers, action economy, and accumulation of abilities.

Cascading Modifiers: So much of the math of 3e interacts with other math. This gives the system a nice, crunchy underpinning, but it becomes a nightmare when a change to one number suddenly changes a half dozen or more other numbers.

In 1e, if you get hit by a ray of enfeeblement, that's also going to affect your to-hit and damage rolls, and probably encumbrance. In 3e, that same spell affects all of that plus skills like Climb, Jump, and Swim. Dexterity is even worse, as it affects Initiative, AC, the Reflex save, ranged attacks, and about eight different skills. At low level, this is manageable because of restrictions on stacking bonuses, but at high levels it can become devilishly difficult to track all the buffs, debuffs, and other modifiers.
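A toy model of the cascade (the skill list and base values are simplified and made up, just to show how one Dexterity change ripples outward):

# Toy example of cascading modifiers: change one ability score and watch
# how many derived numbers move. Base values are placeholders.

def ability_mod(score):
    return (score - 10) // 2   # standard 3e ability modifier

def derived_stats(dex):
    mod = ability_mod(dex)
    return {
        "initiative": mod,
        "ac": 10 + mod,
        "reflex_save": 2 + mod,     # assuming a good base save of +2
        "ranged_attack": 1 + mod,   # assuming BAB +1
        "hide": 4 + mod,            # assuming 4 ranks in each Dex skill
        "move_silently": 4 + mod,
        "tumble": 4 + mod,
    }

print(derived_stats(16))  # before Dex damage
print(derived_stats(10))  # after: every one of these numbers drops by 3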

Action Economy: At high levels, individuals can do more and more in a round. They can take more actions and those actions do more damage. Lanchester's Square Law kicks in. A wizard might cast a spell and have a conjured bear attack. A cleric might buff the entire party. The fighter has multiple attacks. The rogue probably needs to move and sneak attack. This does a couple of things. It slows the game down, as each player needs more and more time to resolve their turn. But it also creates a situation where a solo monster is badly outgunned, and if it loses initiative, it can get pulverized. So many encounters become very binary: if the group wins initiative, they cripple the baddie before she can even act; at best she gets one effect off, and then they finish her.
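For the curious, Lanchester's square law says that with concentrated fire a force's fighting strength scales roughly with the square of its numbers times per-unit effectiveness. A crude sketch of why the action advantage swamps a tougher solo monster (the power figures are invented purely for illustration):

# Back-of-envelope Lanchester square-law comparison; the power figures are made up.
# For concentrated fire, fighting strength ~ (number of combatants)^2 * per-unit power.

def lanchester_strength(units, per_unit_power):
    return units ** 2 * per_unit_power

party = lanchester_strength(4, 1.0)   # four PCs acting every round
solo = lanchester_strength(1, 10.0)   # one monster, ten times as strong individually

print(party, solo)  # 16.0 10.0 -- the outnumbered solo still loses the attrition race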

Again, this is something you wouldn't really see at lower levels and only becomes a problem at higher levels.

Accumulation of Abilities: This is pretty straightforward. At higher levels everyone has so many resources, between class abilities, feats, spells, and magic items, that it can bog the game down while players try to work out the "best" action to take. And at higher levels, 90% of these options are nigh worthless. A sleep spell? Forget it, this thing has too many HD and probably a decent Will save. Improved Trip? No way, it's Huge.
 

Bloody Stupid Johnson

I think some of what they were thinking is going to be determinable from a closer side-by-side look at 2nd edition, particularly the Player's Option books. From about Dragon #263 or so (and for the 12 months/issues after that) they were counting down to 3E, and there are relevant articles there.
There also used to be a section on the WOTC website on how the playtesters had influenced game development; I'll have to dig up the link for that (if it's still there).
EDIT: see links around this area... http://www.wizards.com/dnd/3E_Group_1099.asp
 
Iterative attacks I'm not so sure about; I'll actually have to research that one. They look like they just descend from the fighter's multiple attacks in AD&D, but I'm not sure why they decided to give them to all classes. Best guess would be so that multi-classed fighters (including fighter/rangers and fighter/paladins) didn't lose their multiple attacks. Why they're at -5 is another interesting question (researching...)

There were a couple of related threads that might be interesting too, if you haven't seen them before?
-tangent in the 'underrated RPGs' thread that went on for a couple of pages...
http://www.therpgsite.com/showthread.php?t=14086&page=7  
 
and then there's a forked thread following on from that
http://www.therpgsite.com/showthread.php?t=19662

Bloody Stupid Johnson

Looked through some of the Dragons from #266 or so.
As far as the math goes, the article mentioned earlier about the save progression is the jackpot: Dragon #274 has an article by Tweet on the core mechanic of d20 + modifiers, as well as this earlier note:
 
Quote from: Jonathan Tweet"the most radical change is to have skills, saving throws, attack rolls, and ability checks all run off the same mechanic", Jonathan says. "You roll 1d20 to represent luck, you add a single modifier to represent your character's raw talent (that's your ability modifier), and you add another single modifier to represent your character's training and experience. The higher you get, the better. Now you can measure all your character's bonuses on the same scale. A +5 on an attack roll means about the same thing as a +5 on your Move Silently check, which means about the same thing as a +5 on your Fortitude saving throw".

Then in the article itself, it:
*explains that the d20 is for 'whether' checks and other dice for 'how much' (damage), e.g. a d2 for being kicked by a gnome
 
*explains that +0 is 'commoner standard'; an ordinary person (a non-adventurer with average stats) gets just a d20 with no bonuses, so a 21 represents a result better than they can achieve.
 
*mentions that it's a deliberate design decision that attack bonuses increase faster than Armour Class, but that damage is outpaced by increases in hit points. [essentially the same idea as in older D&Ds; attack bonus scales at about the same rate as THAC0].
(I think a lot of the ACs for monsters may actually have been further raised in the 3.5 revision.)
 
*as noted above, notes that a character's best saves scale at the same rate as save DCs (+1 per 2 levels, vs. the +1 DC from each new spell level). Notes also that "a character's lesser saves increase by only +1 per three levels, so characters actually lose ground in these areas. Magic items, spells, feats and improved hit points generally make up for this shortfall." (Worked numbers below, after this list.)
 
*Notes that DCs of saves are always partly based on an attacker's scores (i.e. presumably to balance the defender adding the bonus to their save). I guess Tweet is visualizing a "save DC" as being essentially the same as an opposed check the caster always 'takes 10' on?
 
*on skills notes 'generally a character's best skills are going to get better more quickly than the enemies' skills are, so high-level rogues, for example, have an easier time sneaking around than they did at lower levels'
 
*notes that where skills are pitted against DCs based on spell level (i.e. a rogue trying to find a glyph of warding) the 'skill user can get ahead of the spellcaster, but that the consequences of failure are higher' - the example given being that the 5th level thief disarming a 5th level caster's glyph and failing gets a 3d8 Inflict Serious, while the 11th level thief vs. the 11th level glyph gets a slay living.
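Putting rough numbers on that save-vs-DC point a couple of bullets up (core progressions only; the flat +5 casting-stat modifier is my own assumption, purely for illustration):

# Core-rules-only sketch of save vs. DC scaling; no items, feats, or stat increases.
# Assumption (mine): the caster uses their highest spell level and has a constant +5 casting-stat modifier.

def highest_spell_level(caster_level):
    return min(9, (caster_level + 1) // 2)  # a wizard gains a new spell level every 2 levels

def good_save(level):
    return 2 + level // 2   # 3e "good" save progression

def poor_save(level):
    return level // 3       # 3e "poor" save progression

for level in (1, 5, 10, 15, 20):
    dc = 10 + highest_spell_level(level) + 5
    print(level, dc, good_save(level), poor_save(level))
# The roll needed (DC minus save) stays around 12-14 for the good save,
# but sits at 16-18 for the poor save and keeps drifting upward.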

Bill

My opinion is that 3x and 4e would have been better served by 'smaller' numbers.

Meaning, a level 20 fighter in 3x needs only a +10 to hit at the very most from level alone, not +20.

4e got that half right, with +1 per 2 levels, but... it goes to level 30.

Both 3x and 4e have too many bonuses as well. Some may enjoy the large numbers, but I feel they add nothing and often create problems.

If I were to build a 3x-type D&D system I would control the numbers from the start. Too many numbers added together equals... numbers that are too large :)

You don't need a +30 on a History skill, or +30 to hit.
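One rough way to see what the big flat bonuses cost you (my own framing, not anything official): the larger the level-based bonus, the smaller the share of the result that actually comes from the die.

# How much of an average check result comes from the d20 vs. the flat level bonus,
# under full-level scaling (3e BAB) and half-level scaling (4e style). Purely illustrative.

AVERAGE_D20 = 10.5

def die_share(level, bonus_per_level):
    flat = level * bonus_per_level
    return AVERAGE_D20 / (AVERAGE_D20 + flat)

for level in (1, 10, 20, 30):
    print(level,
          round(die_share(level, 1.0), 2),   # +1 per level (3e full BAB)
          round(die_share(level, 0.5), 2))   # +1 per 2 levels (4e style)
# At level 20 the die is ~34% of the result under full scaling vs. ~51% under half scaling;
# the bigger the flat numbers, the less the roll itself matters.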

Kord's Boon

Quote from: Bloody Stupid Johnson;621514Iterative attacks I'm not so sure about; I'll actually have to research that one. They look like they just descend from the fighter's multiple attacks in AD&D, but I'm not sure why they decided to give them to all classes. Best guess would be so that multi-classed fighters (including fighter/rangers and fighter/paladins) didn't lose their multiple attacks. Why they're at -5 is another interesting question (researching...)

I always assumed the generation of iterative attacks was an outgrowth of the 'streamlining' of 3e as a whole, much like how ability bonuses now tended to apply the same way to all characters regardless of class, which as a consequence made for more straightforward multiclassing. A base-attack-bonus (BAB) increase always meant the same thing to everyone. (Why they did not say, for instance, "fighters are the only class that gets iterative attacks based on high BAB," I don't know; perhaps to prevent everyone from taking a single level of fighter. Now they just dip two for the feats.)

On a side note I recall reading that the simple additive multiclass option was a last minute accident and not foreseen as a consequence or goal of the streamlining.

As to why the iterative attacks progressed at a -5, it prevents too big a jump in character power. Going from level 4 with one attack (at +4) to two attacks (at +5) is a big jump for one level. Continuing the trend, however, was a mistake: a 4th attack at +5 at level 20 might as well not even be there unless you are abusing the system to net an insane to-hit, and the effect starts to get noticeable long before you hit 20.
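To put numbers on that last point (the AC is arbitrary, and ability, enhancement, and buff bonuses are deliberately left out, so this is only a rough sketch):

# Hit chance for each attack of a 20th-level full-BAB character (+20/+15/+10/+5)
# against an illustrative AC 30.

def hit_chance(attack_bonus, ac):
    """P(hit) on 1d20 + bonus vs. AC; natural 1 always misses, natural 20 always hits."""
    hits = sum(1 for roll in range(1, 21)
               if roll == 20 or (roll != 1 and roll + attack_bonus >= ac))
    return hits / 20

for bonus in (20, 15, 10, 5):
    print(bonus, hit_chance(bonus, 30))
# +20 -> 0.55, +15 -> 0.30, +10 -> 0.05, +5 -> 0.05 (the last attack only lands on a natural 20)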
"[We are all] victims of a system that makes men torture and imprison innocent people." - Sir Charles Chaplin

Bill

Quote from: Kord's Boon;621860I always assumed the generation of iterative attacks was an outgrowth of the 'streamlining' of 3e as a whole, much like how ability bonuses now tended to apply the same way to all characters regardless of class, which as a consequence made for more straightforward multiclassing. A base-attack-bonus (BAB) increase always meant the same thing to everyone. (Why they did not say, for instance, "fighters are the only class that gets iterative attacks based on high BAB," I don't know; perhaps to prevent everyone from taking a single level of fighter. Now they just dip two for the feats.)

On a side note I recall reading that the simple additive multiclass option was a last minute accident and not foreseen as a consequence or goal of the streamlining.

As to why the iterative attacks progressed at a -5, it prevents too big a jump in character power. Going from level 4 with one attack (at +4) to two attacks (at +5) is a big jump for one level. Continuing the trend, however, was a mistake: a 4th attack at +5 at level 20 might as well not even be there unless you are abusing the system to net an insane to-hit, and the effect starts to get noticeable long before you hit 20.

I have always hated the extra attacks at -5/-10, etc...

I would have just given a +1 on damage per +5 base attack or something very simple. The extra attacks are mechanical bloat to me.

Garnfellow

Another unintended side effect of iterative attacks is that it discourages movement, so the fighter ends up rushing to the front and then just taking 5-ft. steps so as not to lose those iterative attacks.
 

Bill

Quote from: Garnfellow;621887Another unintended side effect of iterative attacks is that it discourages movement, so the fighter ends up rushing to the front and then just taking 5-ft. steps so as not to lose those iterative attacks.

I have seen some annoying 'Always run away from being adjacent' foolishness due to this as well.

Just say no to iterative attacks.

Kord's Boon

Quote from: Garnfellow;621887Another unintended side effect of iterative attacks is that it discourages movement, so the fighter ends up rushing to the front and then just taking 5-ft. steps so as not to lose those iterative attacks.

Indeed, and attacks of opportunity do the same once you've engaged.

On that note I do like that 5e is going back to 'run at opponent' and 'hit them' as all that really needs to be done to deal with a spellcaster.

In comparison to 3e/3.5, where it was more like 'move adjacent to target' and 'ready an action to move adjacent to target again when they end their move action' so as to 'force them to provoke an attack of opportunity for casting in melee, provided they did not "cast defensively", and if they do you're boned because they can pimp that Concentration check with magic items', but 'you can take a feat that prevents them from casting defensively, if you even know it exists in the first place', or 'sacrifice damage to potentially trip, disarm or grapple them, which for several other reasons may hurt you more than help', and so on.
"[We are all] victims of a system that makes men torture and imprison innocent people." - Sir Charles Chaplin

Garnfellow

I've been running a Pathfinder Beginner Box game, which has neither AoOs nor iterative attacks, and the results have been revelatory. The fights have been much more fun and dynamic. To say nothing of faster.
 

Garnfellow

As an aside, one potential way to fix AoOs that I've been tinkering with is to make them require a standard action, rather than be automatic. Call it "Defensive Stance." When you use this action, if anyone moves out of a square you threaten, you get to make an attack of opportunity.

This would suggest a feat like "Improved Defensive Stance," which would let you take a defensive stance as a swift action. This feat then becomes a prerequisite for Combat Reflexes.

Taken together, this makes AoOs much rarer.
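A minimal sketch of how I picture the house rule working (my own reading, not official Pathfinder/3e rules; the AoO cap under Combat Reflexes is just a placeholder):

# Under the house rule: no declared stance, no attack of opportunity.
from dataclasses import dataclass

@dataclass
class Combatant:
    name: str
    in_defensive_stance: bool = False   # set by spending a standard (or swift) action this round
    has_combat_reflexes: bool = False

def gets_aoo(stance_user, aoos_taken_this_round):
    """The stance user only gets an AoO if they spent the action; Combat Reflexes allows extras."""
    if not stance_user.in_defensive_stance:
        return False
    max_aoos = 4 if stance_user.has_combat_reflexes else 1  # placeholder cap
    return aoos_taken_this_round < max_aoos

print(gets_aoo(Combatant("fighter", in_defensive_stance=True), 0))  # True
print(gets_aoo(Combatant("wizard"), 0))                             # False -- no stance declared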
 

beejazz

Quote from: Garnfellow;621459Specific to 3e: In 3e, they dropped the idea of a separate end-game and basically re-engineered the math so that the power curve from levels 1-10 extends smoothly to levels 11-20. So lower-level 3e plays a lot like lower level 1st or 2nd edition, but high level play looks pretty different.

Extending the power curve has a few unexpected side effects. The spread between capabilities (AC, Base Attack Bonus, Saves, DCs, skills) becomes so divergent at higher levels that specialized characters tend to always succeed at their core competencies and non-specialized characters tend to always fail. This is equally boring for both sides. The fighter will never try to make a Listen check, because her +1 bonus is worthless. The rogue doesn't really need to roll, because his +20 bonus is so good.

This disparity creates all sorts of unexpected secondary problems. For example, this is what fueled the "save or die" hatred. To my mind, the problem isn't that there are lethal abilities in the game that can just fucking kill you, period. I happen to think that's awesome. The problem is that at the higher levels, these spells tend to be anticlimactic. "The fighter needs to make a Will save? Fuuuuuuuuuuu."

So much this. I've always felt the effects of the widening gap between high and low saves should have been much easier to predict and avoid.

Maybe the gap widened because starting some classes with big bonuses might have encouraged cherry-picking under the multiclassing rules? But then they still have level 1 bonuses that could encourage the same thing.

Quote from: Bloody Stupid Johnson;621796Looked through some of the Dragons from #266 or so.
I'll have to look over this stuff more when I've got a little more time. Thanks for the links though. I would not have known where to start looking myself.
 
Quote*mentions that it's a deliberate design decision that attack bonuses increase faster than Armour Class, but that damage is outpaced by increases in hit points. [essentially the same idea as in older D&Ds; attack bonus scales at about the same rate as THAC0].
(I think a lot of the ACs for monsters may actually have been further raised in the 3.5 revision.)
Attacks progressing faster than AC makes sense two ways. First, it means you can generally offset higher penalties (shoot equal foes from farther away, etc.), and second, it should (ostensibly) mean that attacking doesn't become infeasible for non-combat-specialized characters. In practice the latter often wasn't the case, but if ACs were brought closer to lockstep with attack bonuses in 3.5, that might explain my experience (mostly with 3.5).
 
Quote*as noted above, notes that a character's best saves scale at the same rate as save DCs (+1 per 2 levels, vs. the +1 DC from each new spell level). Notes also that "a character's lesser saves increase by only +1 per three levels, so characters actually lose ground in these areas. Magic items, spells, feats and improved hit points generally make up for this shortfall."
And this is the weirdest bit for me. Partly for the reasons Garnfellow described (either good save characters would always save or low save characters would always fail at some levels and against some effects) but also because players were expected to patch this gap with optional materials.

The designers were building the math on the assumption that people would play a particular way. They assumed that people would recognize their mathematical shortcomings (they tend not to in casual play, IME), and then find or buy these math-patching items. Why not just build the game to play the way they wanted and leave the optional stuff... optional?

And from what I understand, this doesn't sound like the way the game was played before either (the always/never save problem, or the assumption that people needed particular stuff to fix it).
 
Quote*notes that where skills are pitted against DCs based on spell level (i.e. a rogue trying to find a glyph of warding) the 'skill user can get ahead of the spellcaster, but that the consequences of failure are higher' - the example given being that the 5th level thief disarming a 5th level caster's glyph and failing gets a 3d8 Inflict Serious, while the 11th level thief vs. the 11th level glyph gets a slay living.
Personally I prefer something kind of opposite (high level characters slay living with impunity against a world full of peasants, but use their bread-and-butter inflict spells on equal foes). I can see the logic in either option, though. Which was it more like before 3e?