
Attributes - why quantify the average?

Started by Fighterboy, February 15, 2022, 03:33:56 PM


HappyDaze

Quote from: Pat on February 16, 2022, 07:12:10 PM
Quote from: AtomicPope on February 16, 2022, 07:05:27 PM
Ars Magica uses only integers for stats.  That stat is the modifier.  If you have no modifier then the stat is zero.  When I first saw it I thought, "of course!"  Unfortunately for D&D we're stuck in a weird tradition of tables and charts from Gary's time as an insurance clerk in Chicago.  The way I use "zero" stats in D&D is whenever there's a tie in a contest the higher stat wins.
One argument against 0-centered stats is psychological. Shifting a -3 to +3 scale up 4 points to a 1 to 7 scale makes some people happier, because a 4 sounds better than a 0, even if the mechanics are perfectly equivalent in all other ways. This seems to be a factor with 0s and negatives, and can lead to stat inflation because there's pressure to avoid the lower half of the scale.
Beyond this, in dice pool systems, the value tells you how many dice to roll, and rolling fewer than zero dice is unfun.

Wulfhelm

#31
Quote from: Pat on February 16, 2022, 05:53:29 PM
But a PC in an RPG that uses save-or-die effects will have to make save-or-die checks over the course of their career. If the chance of failing goes from 5% to 15%, then the character's expected lifespan will drop by a factor of 3. If a character normally lives for 60 sessions, they'll now have an expected lifespan of 20 sessions.
Hm, no?
If you make a save-or-die roll once per session, as you seem to imply, a character with the +2 bonus has a <50% chance of surviving 14 sessions. Or a <50% chance of surviving 5 sessions without it.
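To make the arithmetic explicit, here is a minimal sketch (assuming exactly one save-or-die roll per session and the 5% vs. 15% per-roll failure chances being discussed; the expected lifespans are where the 3:1 ratio comes from, the medians are where the 14- and 5-session figures come from):

```python
# Survival under repeated save-or-die rolls: one roll per session, with
# per-roll failure chances of 5% (with the +2 bonus) and 15% (without).
for p_fail in (0.05, 0.15):
    expected = 1 / p_fail  # mean of the geometric distribution, in sessions
    # largest n for which the chance of surviving all n sessions is still >= 50%
    n, p_survive = 0, 1.0
    while p_survive * (1 - p_fail) >= 0.5:
        p_survive *= 1 - p_fail
        n += 1
    print(f"fail {p_fail:.0%}: expected lifespan {expected:.1f} sessions, "
          f"survival odds drop below 50% at session {n + 1}")
```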

But more to the point, neither character even in this extreme scenario, with the lowest theoretically possible chance of dying on a single D20 roll, is likely to make it through an entire campaign. And even more to the point, an extreme result is going to do them in.

If you make save-or-die scenarios a regular feature (and they typically don't happen at this extreme range, but more likely at differences of, say, "need 7 or more to survive" vs. "need 5 or more to survive"), your characters will not survive for long unless you either cheat (by means of an actual rule mechanic or otherwise) or are unbelievably lucky.

As I said, you picked the most extreme scenario possible. If the +2 bonus, as it more typically does, shifted your chances from 13 in 20 to 15 in 20, you would not survive, on average, even two occurrences in the former case and less than three in the latter case.

But then again, these are theoretical possibilities, not actual die rolls. In all of these cases, it is quite likely that your bonus never ever matters even once. Do you understand that and do you understand why?

I'm really having flashbacks to a former 3.5 group of mine here. Even after I had hammered home the point that when making saves it was a considerable time saver to just roll the die before calculating their bonus to +2 or +3 accuracy with all the buffs, even after it had been practically demonstrated to them that this almost always worked, they could not wrap their heads around the concept.

Lunamancer

Quote from: Wulfhelm on February 17, 2022, 03:09:38 PM
Quote from: Pat on February 16, 2022, 05:53:29 PM
But a PC in a RPG that uses save or dies will have to make save or dies checks over the course of their career. If the chance of failing goes from 5% to 15%, then the character's expected lifespan will drop by a factor of 3. If a character normally lives for 60 sessions, they'll now have an expected lifespan of 20 sessions.
Hm, no?
If you make a save-or-die roll once per session, as you seem to imply, a character with the +2 bonus has a <50% chance of surviving 14 sessions. Or a <50% chance of surviving 5 sessions without it.

Some friendly advice. If you aren't actually answering the question, you're wrong. Even if your statements and your math are technically correct, if you're not addressing what's at issue, you're wrong. This dude said you'll survive 3 times as long. He spitballed some numbers. His numbers might not be correct--pending the specifics of an unknown number of unstated variables--but the ratio you yourself ended up with came out really close to 3 to 1. So why are you even arguing here?

Quote
But more to the point, neither character even in this extreme scenario, with the lowest theoretically possible chance of dying on a single D20 roll, is likely to make it through an entire campaign. And even more to the point, an extreme result is going to do them in.

If you make save-or-die scenarios a regular feature (and they typically don't happen at this extreme range, but more likely at differences of, say, "need 7 or more to survive" vs. "need 5 or more to survive"), your characters will not survive for long unless you either cheat (by means of an actual rule mechanic or otherwise) or are unbelievably lucky.

Enter the unstated variables. None of this hiding behind abstraction and isolated hypotheticals. What are we actually talking about here? In my experience, the most common form of Save or Die is the save versus poison. And it's most common that a successful attack roll is needed to deliver the poison. So it's not just roll 7 or better. Maybe the snake needs to roll 17 or better just to hit. Here's another variable. What if the fighter manages to slay the snake before the snake even gets to attack? That brings initiative, hit, and damage rolls into the equation.

That makes the math a lot more complicated. But this is what actual play looks like. Now what if we're talking about Dex? That could affect the initiative score. If the fighter is fighting with two weapons, Dex could mitigate two-weapon fighting penalties. And then Dex affects the snake's chance to hit. At the end of it, the chance of the fighter dying from a failed save ends up a lot lower than 5%, and the 2-point Dex adjustment cuts that chance by more than half (the snake's hit roll alone halves it).
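To put rough numbers on that chain (a sketch only; the 17+ hit roll is from the snake example above, and the save failure chance is an assumed placeholder rather than any edition's actual table):

```python
# The fighter only dies if the snake hits AND the poison save then fails.
# Illustrative numbers: the snake needs 17+ to hit, 19+ against +2 Dex AC;
# the 25% save failure chance is an assumed placeholder.
p_fail_save = 0.25

p_hit_base = 4 / 20   # snake hits on 17-20
p_hit_dex = 2 / 20    # +2 Dex to AC pushes that to 19-20

print(f"death chance, no Dex bonus: {p_hit_base * p_fail_save:.1%}")  # 5.0%
print(f"death chance, +2 Dex:       {p_hit_dex * p_fail_save:.1%}")   # 2.5%
```

Folding in initiative and the chance the fighter drops the snake before it ever attacks pushes both figures lower still; the halving from the AC shift alone is the point.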


Quote
But then again, these are theoretical possibilities, not actual die rolls. In all of these cases, it is quite likely that your bonus never ever matters even once. Do you understand that and do you understand why?

I'm really having flashbacks to a former 3.5 group of mine here. Even after I had hammered home the point that when making saves it was a considerable time saver to just roll the die before calculating their bonus to +2 or +3 accuracy with all the buffs, even after it had been practically demonstrated to them that this almost always worked, they could not wrap their heads around the concept.

Well, continuing with my example above, the adjustment will matter in 14.25% of initiative roll pairs (the chance of bumping losses to wins, losses to ties, and ties to wins), 10% of the fighter's hit rolls (of which there will be at least 3, depending on the edition of the game, the fighter's level, potential specialization, and use of a second weapon), and 10% of the snake's hit rolls, assuming the snake survives. All told, the 2-point DEX adjustment is going to matter somewhere in the encounter at least 43.7% of the time. Of course in actual play there could also be STR and CON bonuses to consider, as well as magical adjustments.
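The 43.7% figure is just the complement of the bonus never mattering across all of those rolls. A quick check, taking the per-roll chances quoted above as given:

```python
# Chance the 2-point Dex adjustment matters at least once in the encounter:
# 14.25% for the initiative pair, 10% for each of the fighter's three attack
# rolls, and 10% for the snake's attack roll.
p_never = (1 - 0.1425) * (1 - 0.10) ** 3 * (1 - 0.10)
print(f"adjustment matters at least once: {1 - p_never:.1%}")  # ~43.7%
```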

I completely understand what it is you're saying. It's just that we generally don't resolve situations with a single die roll, and these attributes affect more than just one thing. The totality of actual play is going to produce very different results from your theory. So I would strongly recommend less hammering and more humility.
That's my two cents anyway. Carry on, crawler.


Pat

#33
Quote from: Wulfhelm on February 17, 2022, 03:09:38 PM
Quote from: Pat on February 16, 2022, 05:53:29 PM
But a PC in an RPG that uses save-or-die effects will have to make save-or-die checks over the course of their career. If the chance of failing goes from 5% to 15%, then the character's expected lifespan will drop by a factor of 3. If a character normally lives for 60 sessions, they'll now have an expected lifespan of 20 sessions.
Hm, no?
If you make a save-or-die roll once per session, as you seem to imply, a character with the +2 bonus has a <50% chance of surviving 14 sessions. Or a <50% chance of surviving 5 sessions without it.

But more to the point, neither character even in this extreme scenario, with the lowest theoretically possible chance of dying on a single D20 roll, is likely to make it through an entire campaign. And even more to the point, an extreme result is going to do them in.

If you make save-or-die scenarios a regular feature (and they typically don't happen at this extreme range, but more likely at differences of, say, "need 7 or more to survive" vs. "need 5 or more to survive"), your characters will not survive for long unless you either cheat (by means of an actual rule mechanic or otherwise) or are unbelievably lucky.

As I said, you picked the most extreme scenario possible. If the +2 bonus, as it more typically does, shifted your chances from 13 in 20 to 15 in 20, you would not survive, on average, even two occurrences in the former case and less than three in the latter case.

But then again, these are theoretical possibilities, not actual die rolls. In all of these cases, it is quite likely that your bonus never ever matters even once. Do you understand that and do you understand why?

I'm really having flashbacks to a former 3.5 group of mine here. Even after I had hammered home the point that when making saves it was a considerable time saver to just roll the die before calculating their bonus to +2 or +3 accuracy with all the buffs, even after it had been practically demonstrated to them that this almost always worked, they could not wrap their heads around the concept.
Yes, I did choose the most extreme example. Because it illustrates the point I'm making. Go back and read my post that started this whole tangent. My entire point was that the effect of a bonus depends on where in the d20 range it falls.

Which you're finally acknowledging now, which I suppose is progress. Though you're also denying it, and combining it with a patronizing arrogance. So baby steps, I guess.

You are correct that most saves aren't quite that extreme. But your baseline assumptions aren't universal. For instance, you're using 5+ or 7+ as typical for saves, and that's not true across all editions. In AD&D, a 1st level fighter needs a 17+ to save vs. spells. More generally, in old school variants of D&D, saves follow a progression where characters have a small chance at success at low levels, and only improve into your range at the very highest of levels, and often not even then -- a 100th level thief in AD&D, for instance, has saves as bad as 11+ (breath weapon).

This gets more complex in third edition, where saves turn into an arms race against DCs. Whether you need a 2+ to save, or a natural 20, depends not just on a character's level, but weak vs. strong saves, DC optimization (usually involving spells), the weird effects of monster HD, and other factors. By mid level, saving against an effect that targets one of your strongest saves becomes very likely, while saving against an effect that targets one of your weakest saves becomes unlikely. And at the highest levels, this variance can easily exceed the d20 range, meaning only a natural 1 or a natural 20 matters.

D&D does tend toward extremes in that regard, particularly older editions. This also comes up with attacks in old school D&D, where characters start with a poor chance to hit many opponents, but by high levels are hitting almost all the time. More generally, across the entire RPG space, a lot of designers aim at giving PCs a good chance of success on positive actions. That means skill checks, attacks, and so on often have about a 70% chance of success. This is inverted for negative effects, like saves of various kinds. Which more closely maps to your assumptions. And while these won't all lead to a 3:1 difference in survival, if you're talking about bonuses in the -5/+5 range (not atypical for D&D) when the baseline is about a 70% chance of success, then you run into the end of the d20 spectrum fairly frequently. And when that happens, the effect of a +2 can shift fairly dramatically.

Palleon

Sure, it's fine if you're using a simple system wherein the attributes cannot be modified during game play. Otherwise the distribution of normally generated stats needs to be preserved.

I don't care for systems where everything remains static from character creation, for anything more than one-shots.

Eric Diaz

Quote from: Cat the Bounty Smuggler on February 16, 2022, 12:58:46 PM
Quote from: Wulfhelm on February 16, 2022, 05:28:26 AM
That is just a matter of granularity. Attributes could still be quantified beyond "bad, normal, good". It's relatively simple: A range of X (=15 for attributes from 3 to 18) is only useful for rulings if you want X possible outcomes. I mean, what if she had Cha 11? Or Cha 13? In my experience, that level of granularity usually doesn't apply, and even a judgment like "she has to try a little harder" makes little difference to "it works" in practice.

The ranges don't have to be the same for every situation, though. This NPC requires a Cha 14 to bluff, that one is more gullible and only needs an 11, and that one over there is a master spy who won't fall for anything less than Cha 17. And as @Eric Diaz said, with a roll-under or even a modified d% task, every point counts.

It's ultimately a matter of taste, of course. My point is that the old style was to use anything and everything at your disposal to make rulings, and the full 3-18 range is one of the things that can be used.

Quote from: Eric Diaz on February 16, 2022, 10:46:35 AM
Depends on the system... If you use roll d20 under attribute, obviously every point counts.

And players often like these small details about their PCs.

For NPCs, however, it's +0s all over, unless that particular individual deserves a distinction.

Even better: just stat NPCs as monsters. If something is going to be relevant outside of combat, you can note it in the description.




Oh, and I thought of another reason: 0e and B/X allow you to move points around at character creation. So at least there, the actual scores matter.

Yes, sure; 99% of NPCs do not require unique stats.

Wulfhelm

Quote from: Lunamancer on February 18, 2022, 02:16:03 AM
Some friendly advice. If you aren't actually answering the question, you're wrong.
Where did you see a question in his posting? (<- That was a question.)

Quote
Enter the unstated variables. None of this hiding behind abstraction and isolated hypotheticals. What are we actually talking about here? In my experience, the most common form of Save or Die is the save versus poison. And it's most common that a successful attack roll is needed to deliver the poison. So it's not just roll 7 or better. Maybe the snake needs to roll 17 or better just to hit. Here's another variable. What if the fighter manages to slay the snake before the snake even gets to attack? That brings initiative, hit, and damage rolls into the equation.
Then the actual save-or-die roll becomes so rare that a +2 bonus on it is not likely to ever matter over the course of an entire campaign.

It is actually very simple, and remains so: such a small bonus is irrelevant for 90% of all single rolls. To become statistically relevant, it needs to come into play very often. And if a severe in-game consequence (e.g. character death) is tied to it, and the roll comes into play very often, then even in the extreme case, and most definitely in more typical ones, that severe consequence is likely to happen sooner rather than later.

This reminds me of one former DM of mine who insisted that every combat should be designed so that it would be an extremely close affair and the party had a good chance of losing catastrophically. I told him in no uncertain terms that, statistically, unless combat was going to be very rare, this would inevitably result in TPKs or near-TPKs very soon. He didn't believe me. But before long he started fudging results to avoid the very scenario I had predicted.
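The compounding behind that prediction is easy to show; a minimal sketch, with the per-combat catastrophe chance as an assumed placeholder for "a good chance of losing catastrophically":

```python
# Chance of at least one catastrophic loss over a string of deliberately
# close combats, assuming an illustrative 25% catastrophe chance per combat.
p_loss = 0.25
for combats in (3, 5, 10, 20):
    p_at_least_one = 1 - (1 - p_loss) ** combats
    print(f"{combats:>2} combats: {p_at_least_one:.1%} chance of a TPK or near-TPK")
```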

Wulfhelm

Quote from: Pat on February 18, 2022, 09:41:17 AM
Yes, I did choose the most extreme example. Because it illustrates the point I'm making.
That approach doesn't have a lot of merit when talking about practical applications. Just sayin'.

Quote
This gets more complex in third edition, where saves turn into an arms race against DCs. Whether you need a 2+ to save, or a natural 20, depends not just on a character's level, but weak vs. strong saves,
It also depends on having a +18 difference in saves. Not +2. You're not going to get any argument from me against a +18 bonus being relevant in almost all cases (90% of all single rolls to be exact, so the precise inversion of the situation with the +2 bonus.)
But a +2 bonus is only going to be relevant every 10 rolls on average. Now there's one area of D&D derivatives (including the main perp, 5E) where this can and does happen: combat, at least at higher levels. Simply because many attack rolls are usually needed to shave down a high-level entity's hit points.
(There's one other area, perception, but that is another gripe of mine in general.)

But for things which you don't roll that often, it's different. A +2 bonus on a d20 becomes relevant every 10 rolls, statistically, regardless of the actual target numbers involved. So if you only use that roll 2 or 3 times over the course of a session (or campaign?), it is not likely to ever become relevant over the course of that session (or campaign).
Small bonuses relative to the dice roll range only become relevant if you roll very often. I really don't know how I can elaborate further on this basic fact.
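Put another way (a minimal sketch of the same point): as long as the two-point window stays inside the 1-20 range, the chance the bonus matters at least once over n rolls is 1 - 0.9^n, whatever the target number is.

```python
# Chance a +2 bonus changes at least one outcome over n d20 rolls, given it
# changes any single roll's outcome 2 times in 20 (10%).
for n in (1, 2, 3, 5, 10, 20, 50):
    print(f"{n:>2} rolls: {1 - 0.9 ** n:.1%}")
```

Two or three rolls a session leaves it more likely than not that the bonus never mattered; dozens of rolls per session make it near-certain that it did at some point.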

Pat

Quote from: Wulfhelm on February 19, 2022, 01:54:35 PM
Quote from: Pat on February 18, 2022, 09:41:17 AM
Yes, I did choose the most extreme example. Because it illustrates the point I'm making.
That approach doesn't have a lot of merit when talking about practical applications. Just sayin'.
I am talking about practical applications. There are situations where someone has a 15% chance of failure, and a +2 bonus means they end up with a 5% chance of failure. It's not even particularly uncommon.

Quote from: Wulfhelm on February 19, 2022, 01:54:35 PM
Quote
This gets more complex in third edition, where saves turn into an arms race against DCs. Whether you need a 2+ to save, or a natural 20, depends not just on a character's level, but weak vs. strong saves,
It also depends on having a +18 difference in saves. Not +2. You're not going to get any argument from me against a +18 bonus being relevant in almost all cases (90% of all single rolls to be exact, so the precise inversion of the situation with the +2 bonus.)
But a +2 bonus is only going to be relevant every 10 rolls on average. Now there's one area of D&D derivatives (including the main perp, 5E) where this can and does happen: combat, at least at higher levels. Simply because many attack rolls are usually needed to shave down a high-level entity's hit points.
(There's one other area, perception, but that is another gripe of mine in general.)

But for things which you don't roll that often, it's different. A +2 bonus on a d20 becomes relevant every 10 rolls, statistically, regardless of the actual target numbers involved. So if you only use that roll 2 or 3 times over the course of a session (or campaign?), it is not likely to ever become relevant over the course of that session (or campaign).
Small bonuses relative to the dice roll range only become relevant if you roll very often. I really don't know how I can elaborate further on this basic fact.
Have you ever played third edition? At high levels, a +18 difference between the saves of characters in the same party is common. And that has nothing to do with the +2, because the 18-point difference I described is between different characters, while the +2 bonus we've been talking about is a bonus applied to a particular character's roll. Not even vaguely the same thing.

Your "+2 only comes up 1 in 10 times" is also utterly irrelevant to the points I've been making. I've been talking about how the probabilities change as the target number changes, the effect of low-probability but significant events over time/many rolls, and how small bonuses can multiply their effect.

You're either not reading what I'm writing, or you have such a weak understanding of the topic that you can't even grasp the most basic of concepts.

VisionStorm

This thread is about whether or not a +2 bonus in a d20 game is significant now.  :o

Somewhere between "A +2 bonus could snatch you from the jaws of death!" and "a +2 bonus is completely and utterly irrelevant" the truth lies.

Fact of the matter is, as mundane as a +2 bonus may sound, most characters don't have ONLY a +2 bonus. They usually have a +2 bonus from a stat, +5 from a skill, +3 from a Feat or something, plus probably some buff effect or a magic item; that's already a +10 total from those three things alone, which is half the variable range of a d20. And depending on the edition and the level, characters could get way higher than that. So how many bonuses can we add to the same roll, and how high can each of those individual bonuses be, before it gets ridiculous? And how much room for growth do you have left if you blow everything into making attribute bonuses high from the outset?

That +2 bonus sounds low on its own, but that's usually just an extra you add on top of a bunch of other crap. And attributes are the most general abilities in the game, so they can't be too high, cuz they affect everything tied to that ability, including derived stats like HP or stuff like damage.
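A rough illustration of the stacking point (a sketch; the lone +2 vs. the stacked +10 are the hypothetical figures above, and natural-1/natural-20 rules are ignored):

```python
# Success chance on d20 + modifier vs. a DC, comparing the lone +2 stat
# bonus to the stacked +10 (stat + skill + feat) described above.
def success(mod: int, dc: int) -> float:
    return max(0, min(20, 21 - (dc - mod))) / 20

for dc in (10, 15, 20, 25):
    print(f"DC {dc}: +2 -> {success(2, dc):.0%}, +10 -> {success(10, dc):.0%}")
```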

Wulfhelm

#40
Quote from: VisionStorm on February 19, 2022, 08:51:32 PM
Somewhere between "A +2 bonus could snatch you from the jaws of death!" and "a +2 bonus is completely and utterly irrelevant" the truth lies.
Yes. It lies with "a +2 bonus is mostly irrelevant". Which is what I'm saying. Confronted with a situation where a d20 roll can snatch you(r character, presumably) from the jaws of death, in 90% of all such cases a +2 bonus won't have helped.

This is a complete tangent now, but one problem with this is that a lot of d20-based systems try to sell a +2 bonus as some major difference in character competence; e.g. the proficiency bonus difference between a 1st-level and a 10th-level character in 5E, or the difference between an average Str 10 schlub and the Str 15 village strongman in 3.x.

Of course, that is just as related to the randomness of d20-based (or more generally dice-based) resolution as to the specific range. If you say "you need to beat DC 13 to lift the gate", the Str 10 schlub is obviously 80% as likely to do it as the village strongman ("... lift with the legs, Sir Rogar!").
If OTOH you say "you need Str 13+ to lift the gate", or possibly "you need [rolls 1d6+10] Str 13+ to lift the gate", you have a different situation and the problem vanishes. Of course, if systematized, the latter approach also has its own randomness problems.
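Putting numbers on the gate example (a sketch; the DC 13 roll-over check and the Str 13+ / 1d6+10 thresholds are the ones from the paragraph above):

```python
from fractions import Fraction

# "Beat DC 13 on d20 + Str modifier" vs. a flat "Str 13+ qualifies" threshold,
# comparing the Str 10 schlub (+0) with the Str 15 strongman (+2 in 3.x terms).
def roll_over(mod: int, dc: int = 13) -> Fraction:
    return Fraction(max(0, min(20, 21 - (dc - mod))), 20)

print("d20 vs DC 13:", roll_over(0), "vs", roll_over(2))  # 2/5 vs 1/2 -> 80% as likely
print("Str >= 13:   ", 10 >= 13, "vs", 15 >= 13)          # False vs True
# The randomized version, "you need Str >= 1d6+10":
print("Str >= 1d6+10:",
      Fraction(sum(10 >= 10 + d for d in range(1, 7)), 6), "vs",
      Fraction(sum(15 >= 10 + d for d in range(1, 7)), 6))  # 0 vs 5/6
```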

Pat

#41
Quote from: VisionStorm on February 19, 2022, 08:51:32 PM
This thread is about whether or not a +2 bonus in a d20 game is significant now.  :o
That's not what we've been discussing. My entire point, from the very start, is that the effect of a +2 bonus depends on where you are in the d20 range.

For instance, let's say you're a 1st level fighter in AD&D. You're fighting an opponent with reasonable armor, so you need to roll a 20 to hit. But let's say you get a +2 bonus to hit for some reason. That means you now hit on an 18, 19, or 20. Three times as often. Which means a +2 bonus results in a threefold increase in damage, over time, against opponents with that AC.

That's huge. A +2 to hit means a lot in that circumstance. It also means a lot when you have a small chance to save against a save-or-die effect. But the importance of a +2 diminishes dramatically as we move away from that end of the d20 range. If you need a 12+ to hit, then a +2 means you need a 10+ to hit. That means your chance to hit increases from 9 out of 20 to 11 out of 20. That's 11/9, or a 22% improvement. In the middle of the range, a +2 means a x1.22 increase in damage, over time. That's much, much smaller.

Here's the effect at each point in the range:

Base roll needed to hit / Increase in average damage over time with a +2 bonus to hit
20   x3
19   x2
18   x1.67
17   x1.5
16   x1.4
15   x1.33
14   x1.29
13   x1.25
12   x1.22
11   x1.2
10   x1.18
9   x1.17
8   x1.15
7   x1.14
6   x1.13
5   x1.13
4   x1.12
3   x1.11
2   x1.11

While x3 damage is huge, look at what happens as you shift further down the table. Even a single point improvement in your chances means a +2 bonus drops from x3 to x2, which is a big drop. And further down the scale, it basically plateaus. The difference between x1.13 and x1.13 is literally a rounding error at two digits.

And note it's never just 10%. A +2 is a 10-percentage-point shift along the spectrum, but it never results in a mere +10% relative increase in the chances of success.

That's my point. It's useful to recognize that a +2 bonus means a lot if you're at the right end of the spectrum, but the effect drops steeply as you move away from that end.
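For anyone who wants to reproduce the table, the multipliers fall straight out of the hit chances. A minimal sketch that, like the table, ignores natural-1/natural-20 rules (which is also why the base-2 row would really need its with-bonus chance capped at 20/20), and that rounds the exact 1.125 at base 5 to 1.12 rather than 1.13:

```python
# Ratio of hit chance with a +2 bonus to hit chance without it, per base
# target number on a d20 (natural-1/natural-20 rules ignored, as in the table).
for target in range(20, 2, -1):
    without = (21 - target) / 20
    with_bonus = (23 - target) / 20
    print(f"{target:>2}   x{with_bonus / without:.2f}")
```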

Wulfhelm

Quote from: Pat on February 20, 2022, 04:44:43 AM
That's not what we've been discussing. My entire point, from the very start, is that the effect of a +2 bonus depends on where you are in the d20 range.
... in response to my assertion that in 90% of all actual rolls, a +2 bonus is irrelevant, which is a simple fact unaffected by where you are in the d20 range. Whether it is the difference between needing 1 vs. 3, 7 vs. 9, or 17 vs. 19 - in 90% of all these cases, the bonus will not have mattered.

So, on that front, you either a.) simply refused to accept statistical reality or b.) don't like that aspect of statistical reality and thus built a strawman from a different aspect.

But let's talk about that strawman:

Quote
For instance, let's say you're a 1st level fighter in AD&D. You're fighting an opponent with reasonable armor, so you need to roll a 20 to hit. But let's say you get a +2 bonus to hit for some reason. That means you now hit on an 18, 19, or 20. Three times as often. Which means a +2 bonus results in a threefold increase in damage, over time, against opponents with that AC.

a.) It also means that even with the bonus, in 85% of all combat rounds you miss. So what you have here is either a boring fight with an extremely high whiff factor, especially if "I attack" is the only sensible option for your 1st level fighter or...
b.) ... you're fighting an opponent who severely outclasses you and who will very likely kill you before your +2 bonus ever becomes relevant.
c.) If on the other hand, you are fighting another 1st level (or otherwise low hp) opponent who just happens to have an AC of 0 or lower, and you also have the same amount of armor, then yes, you are likely to luck out before he does. But it would just be that: Lucking out. And for the specific example, the random damage roll is going to be much more relevant than the to-hit roll in such scenarios.

In a different scenario with higher-level characters but the same sort of to-hit probabilities: that is basically the only scenario in D&D-derivative games where such a small bonus becomes a discernible advantage, because, as I explained earlier, it is only relevant when you roll a lot - and with high hit point totals that becomes feasible, because characters no longer go down in one or two hits.

Wulfhelm

And a couple more remarks on your combat (why always combat?) example:

1.) Base roll needed to hit / Average number of combat rounds until a +2 bonus becomes relevant.
20   10
19   10
4-18 10
3   9... just kidding, it's 10
2     see below

2.) Only when the base number needed to hit is exactly 20 do you statistically score more hits thanks to the bonus than you would have scored without it. If it is 19, the numbers are even, and for all lower numbers, the hits you would have scored anyway increasingly outnumber the hits you actually needed the bonus for.

3.) Speaking of that: If the base number is higher than 20, which is a possibility in all D&D variants I know, then the bonus becomes less relevant again. If in PF 1st, you have AC 22 and your opponents have +2 and +0 bonuses respectively, then the +2 bonus obviously has no relevance whatsoever. Similarly so when the base number is below three. So I admit wrt combat, I was incorrect in stating that for all scenarios, in 90% of all actual rolls the bonus does not matter. To correct myself: In at least 90% of all actual rolls, the +2 bonus does not matter.

4.) Speaking some more of that: In games where a "natural" 20 is an automatic critical hit, the bonus again becomes even less relevant.

In any event: Point still stands - small bonuses are rarely relevant for individual rolls, thus they become relevant statistically only when you make a lot of rolls, for which combat is the go-to example if not the only one.
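Point 2 in numbers (a sketch, again ignoring natural-1/natural-20 rules): the +2 always buys the same 2-in-20 slice, so whether that slice outweighs the hits you would have scored anyway depends entirely on the base target number, while the expected wait until the bonus first matters stays at 10 rolls.

```python
# Per base target number: hits per 20 rolls you would score anyway, vs. the
# 2 extra hits per 20 rolls that the +2 bonus buys (nat-1/nat-20 ignored).
for target in (20, 19, 15, 11, 5):
    baseline = 21 - target
    print(f"need {target:>2}: {baseline} baseline hits vs 2 bonus-bought hits "
          f"per 20 rolls; expected rolls until the bonus first matters: 10")
```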

VisionStorm

OK, now that we've established that we're just talking past each other and quibbling about statistical anomalies (or whatever term would fit without turning this into a protracted semantic discussion) at the lower or higher ends of the spectrum: what would be a way to handle rolls and ability increments that would make them feel more relevant, while still allowing some room for growth? Or is that even a possibility in d20+mod mechanic systems (or other systems, for that matter)?

Cuz it seems to me that simply making modifiers larger (for example) would just create its own share of problems, like limiting room for growth or creating its own statistical anomalies, where getting any bonus would make average-difficulty tasks trivial and near-automatic, but not getting a bonus would make any (or most) high-difficulty tasks a whiff fest.




Also (on a separate, but related note), the issue with Strength tests in the D&D/d20 System is that they exist (or at least should exist) on a separate scale from most skill rolls, but are still handled using the same mechanics as skills, despite there being no skill bonus to add on top of the attribute bonus to the roll. So it creates a circumstance where the ONE ability that gives you any possible bonus to that specific type of roll doesn't provide a high enough bonus for it to be meaningful.

And much like combat, this is the ONLY example where this stuff normally happens*, cuz most tasks related to other attributes can potentially benefit from some skill or another to modify the base stat and give you extra bonuses to make up the difference. If you're attempting some acrobatic stunt, for example, that's clearly tied to Dexterity, but it's also reliant on your Acrobatics skill rather than just raw ability. So you'd get multiple stacking bonuses to determine your total modifier for your roll, allowing for greater differences between someone with the proper abilities (plural) for that task and someone without them.

The same could potentially apply to just about any other raw attribute roll. A knowledge-related roll, for example, would clearly be based on INT, but if you have no skill related to that type of knowledge, your raw INT alone shouldn't give you that much of an advantage, even though a smarter character obviously should get a higher chance of recalling a bit of relevant info.

It's only on STR vs STR rolls that we get into this issue, where STR is the ONLY obvious ability that would fit the roll, and a +2 bonus is supposed to represent "high" STR, which SHOULD be significant for that roll and that type of roll alone*, but barely gives you any advantage, cuz you only get your attribute bonus alone, while the system is built under the assumption that attribute bonuses are only an extra you add on top of other stuff (such as a Skill or Combat Bonus) to supplement it. And attribute bonus ranges reflect that assumption. So proper STR vs STR rolls fall apart, cuz pure raw attribute bonuses are built on a scale that assumes you always get another modifier and your attribute bonus is just an extra you add on top. Except that a STR vs STR test is a standalone roll where STR is the only relevant ability, and every point of difference should give you a significant edge over someone who doesn't have a bonus, yet it doesn't.

*And even if someone were to come up with an example of another attribute having a similar issue (I can't think of one, but maybe there's an example out there), I would submit that every single one of those potential examples would be an outlier rather than a common roll, and probably less likely to come up than a STR vs STR roll. But the absolute VAST majority of potential task rolls in an RPG OTHER than STR vs STR (or some pure STR rolls, like breaking stuff) would have SOME hypothetical skill/combat bonus or whatever tied to it that would make up the difference when determining the ultimate range of possible modifiers you could get for the roll. It's only on raw Attribute vs Attribute (or pure attribute) rolls that this happens, and only STR vs STR where it seems that the attribute itself should also effectively be the "skill" as well. But there's no mechanic to make up for the difference.
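For the STR vs STR case specifically, here's what a lone +2 buys in an opposed roll (a sketch assuming a straight opposed d20 + modifier check with ties left standing; systems differ on how ties are broken):

```python
from itertools import product

# Opposed check: d20 + bonus vs. an unmodified d20, all 400 pairs enumerated.
def opposed(bonus: int) -> tuple[float, float, float]:
    wins = ties = losses = 0
    for a, b in product(range(1, 21), repeat=2):
        if a + bonus > b:
            wins += 1
        elif a + bonus == b:
            ties += 1
        else:
            losses += 1
    return wins / 400, ties / 400, losses / 400

for bonus in (0, 2):
    w, t, l = opposed(bonus)
    print(f"+{bonus}: win {w:.1%}, tie {t:.1%}, lose {l:.1%}")
```

The +2 moves the "strong" character from a coin flip to winning roughly 57% of the time: real, but hardly the decisive edge the attribute is supposed to represent.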