
Character Generation: Do you prefer 3d6, 4d6, Straight Down, Arrange to Taste?

Started by Jam The MF, June 19, 2021, 12:07:56 AM


Pat

Quote from: Eirikrautha on June 21, 2021, 10:17:39 PM
Quote from: Pat on June 21, 2021, 09:43:09 PM
Quote from: Eirikrautha on June 21, 2021, 09:33:52 PM
Quote from: jhkim on June 21, 2021, 04:31:36 PM
Quote from: Eirikrautha on June 21, 2021, 03:25:29 PM
Both of these hit on an important point. In early D&D, bonuses were mostly earned via class progression and/or magic items.  Stats didn't go up naturally via leveling (that was completely a feature of 3e and later).  So your stats really didn't have much effect beyond a small attribute bonus, level limits, and an experience bonus.  The method of stat generation has only really become an issue once stats became more important to character utility and performance...

That wasn't my experience with AD&D first edition. I always felt like the AD&D1 ability score rolls were extremely important - moreso than later editions. Later editions smoothed out the bonuses so there was more of an incremental change as you got better, and removed the level limits and experience bonus based on stats. Among groups I played, getting an 18 in your primary stat was a big deal -- much bigger than in later editions when 18 is just a minor incremental change from 16. Most of the attributes charts had sharp changes in the 15-18 range, and further, high stats let you be limited character types like a ranger or paladin that weren't available at all otherwise.

The main offset to this was just that there tended to be more randomness in adventures, so even a great stat character could fail a save-or-die effect, for example, or get level drained.
Please explain how, to a non-fighter, the difference between an 8 or an 18 is a "big deal."  Explain how it has " sharp changes in the 15-18 range."
Or how a 14 in con is so much better than a 7.  Or how a non-fighter experiences a "sharp change" in bonuses above 14.  You must have been using a very different edition of AD&D than the rest of us...

Edit: One of those charts is from 2e, but the bonuses are the same...
Where did you get most of that? He specifically said an 18 in a primary stat was a big deal, which means fighters, not "non-fighters". And he was talking about the bonuses in the 15-18 range, and didn't say a thing about how a 14 was much better than a 7.
"Most of the attributes charts had sharp changes in the 15-18 range".  Don't see primary stat anywhere in that sentence.

But even assuming he meant for the previous mention to apply here... nope... they didn't.  Con, even for fighters, gives only hp (the other effects are in single digit percent).  And 18, even if you were a fighter and used percentile strength (which was optional), you went from +1 to +2 to hit unless you roll a 00.  Wow, that +1 is game-changing...
I highlighted the primary for you.

Eirikrautha

Quote from: Pat on June 21, 2021, 11:04:32 PM
Quote from: Eirikrautha on June 21, 2021, 10:17:39 PM
Quote from: Pat on June 21, 2021, 09:43:09 PM
Quote from: Eirikrautha on June 21, 2021, 09:33:52 PM
Quote from: jhkim on June 21, 2021, 04:31:36 PM
Quote from: Eirikrautha on June 21, 2021, 03:25:29 PM
Both of these hit on an important point. In early D&D, bonuses were mostly earned via class progression and/or magic items.  Stats didn't go up naturally via leveling (that was completely a feature of 3e and later).  So your stats really didn't have much effect beyond a small attribute bonus, level limits, and an experience bonus.  The method of stat generation has only really become an issue once stats became more important to character utility and performance...

That wasn't my experience with AD&D first edition. I always felt like the AD&D1 ability score rolls were extremely important - moreso than later editions. Later editions smoothed out the bonuses so there was more of an incremental change as you got better, and removed the level limits and experience bonus based on stats. Among groups I played, getting an 18 in your primary stat was a big deal -- much bigger than in later editions when 18 is just a minor incremental change from 16. Most of the attributes charts had sharp changes in the 15-18 range, and further, high stats let you be limited character types like a ranger or paladin that weren't available at all otherwise.

The main offset to this was just that there tended to be more randomness in adventures, so even a great stat character could fail a save-or-die effect, for example, or get level drained.
Please explain how, to a non-fighter, the difference between an 8 or an 18 is a "big deal."  Explain how it has " sharp changes in the 15-18 range."
Or how a 14 in con is so much better than a 7.  Or how a non-fighter experiences a "sharp change" in bonuses above 14.  You must have been using a very different edition of AD&D than the rest of us...

Edit: One of those charts is from 2e, but the bonuses are the same...
Where did you get most of that? He specifically said an 18 in a primary stat was a big deal, which means fighters, not "non-fighters". And he was talking about the bonuses in the 15-18 range, and didn't say a thing about how a 14 was much better than a 7.
"Most of the attributes charts had sharp changes in the 15-18 range".  Don't see primary stat anywhere in that sentence.

But even assuming he meant for the previous mention to apply here... nope... they didn't.  Con, even for fighters, gives only hp (the other effects are in single digit percent).  And 18, even if you were a fighter and used percentile strength (which was optional), you went from +1 to +2 to hit unless you roll a 00.  Wow, that +1 is game-changing...
I highlighted the primary for you.
I highlighted the part where it didn't matter for you.
"Testosterone levels vary widely among women, just like other secondary sex characteristics like breast size or body hair. If you eliminate anyone with elevated testosterone, it's like eliminating athletes because their boobs aren't big enough or because they're too hairy." -- jhkim

Tristan

4d6, drop the lowest, arrange to taste is how we mostly did it.

I know later editions have moved to arrays, and I can see the appeal in it honestly. Just old habits are hard to break.

 

jhkim

Quote from: Eirikrautha on June 21, 2021, 09:33:52 PM
Quote from: jhkim on June 21, 2021, 04:31:36 PM
That wasn't my experience with AD&D first edition. I always felt like the AD&D1 ability score rolls were extremely important - moreso than later editions. Later editions smoothed out the bonuses so there was more of an incremental change as you got better, and removed the level limits and experience bonus based on stats. Among groups I played, getting an 18 in your primary stat was a big deal -- much bigger than in later editions when 18 is just a minor incremental change from 16. Most of the attributes charts had sharp changes in the 15-18 range, and further, high stats let you be limited character types like a ranger or paladin that weren't available at all otherwise.
Please explain how, to a non-fighter, the difference between an 8 or an 18 is a "big deal."  Explain how it has " sharp changes in the 15-18 range."
Or how a 14 in con is so much better than a 7.  Or how a non-fighter experiences a "sharp change" in bonuses above 14.  You must have been using a very different edition of AD&D than the rest of us...

I can explain the sharp changes I'm talking about. With lucky rolls you'll get an 18 in your top/primary stat, with unlucky rolls you'll still get at least a 14 or so. The question is, how much difference does that luck make. For example, in AD&D1, the Dex bonuses are like this:

Dex 14: +0AC, Dex 15: -1AC, Dex 16: -2AC, Dex 17: -3AC, Dex 18: -4AC

In 3rd-5th edition, the bonuses are:

Dex 14: -2AC, Dex 15: -2AC, Dex 16: -3AC, Dex 17: -3AC, Dex 18: -4AC

So even if you have only a 14 in your thief's dexterity, you still get a +2 bonus to missiles and AC. In AD&D1, you get nothing with a 14. The range of 14 to 18 has a much bigger change.

The contrast is a bit less in Strength, but still notable.

Str 14: +0 , Str 18/51: +2 hit, +3 damage

Later editions:

Str 14: +2 , Str 18: +4

If you only have a 14 as your top stat in later editions, the difference is relatively minor: just a 2-point difference in modifier compared to an 18. In AD&D1, with high stats you not only have a much greater swing in bonus/benefit -- there's also the +10% experience bonus and possibly more class options like druid, illusionist, ranger, and paladin.
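A minimal Python sketch of this comparison, using the AD&D1 Dexterity adjustments exactly as quoted in the post above and the usual d20-era modifier of (score - 10) divided by 2, rounded down; the table values are taken from the post rather than re-checked against the books:

# Sketch: how much does going from a 14 to an 18 buy you in each system?
# AD&D1 Dex AC adjustments are the ones quoted above (lower AC is better).
adnd1_dex_ac = {14: 0, 15: -1, 16: -2, 17: -3, 18: -4}

def d20_modifier(score: int) -> int:
    """3e-5e style ability modifier."""
    return (score - 10) // 2

for score in (14, 15, 16, 17, 18):
    print(f"Dex {score}: AD&D1 AC adjustment {adnd1_dex_ac[score]:+d}, "
          f"d20-era modifier {d20_modifier(score):+d}")

# The 14-to-18 swing in each system:
print("AD&D1 swing, 14 -> 18:", abs(adnd1_dex_ac[18] - adnd1_dex_ac[14]))  # 4 points of AC
print("d20-era swing, 14 -> 18:", d20_modifier(18) - d20_modifier(14))     # 2 points

Run as-is, it shows the 4-point AC swing between a 14 and an 18 in AD&D1 against the 2-point swing in the d20-era formula, which is the gap being described.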

Eirikrautha

Quote from: jhkim on June 22, 2021, 03:25:03 AM
Quote from: Eirikrautha on June 21, 2021, 09:33:52 PM
Quote from: jhkim on June 21, 2021, 04:31:36 PM
That wasn't my experience with AD&D first edition. I always felt like the AD&D1 ability score rolls were extremely important - moreso than later editions. Later editions smoothed out the bonuses so there was more of an incremental change as you got better, and removed the level limits and experience bonus based on stats. Among groups I played, getting an 18 in your primary stat was a big deal -- much bigger than in later editions when 18 is just a minor incremental change from 16. Most of the attributes charts had sharp changes in the 15-18 range, and further, high stats let you be limited character types like a ranger or paladin that weren't available at all otherwise.
Please explain how, to a non-fighter, the difference between an 8 or an 18 is a "big deal."  Explain how it has " sharp changes in the 15-18 range."
Or how a 14 in con is so much better than a 7.  Or how a non-fighter experiences a "sharp change" in bonuses above 14.  You must have been using a very different edition of AD&D than the rest of us...

I can explain the sharp changes I'm talking about. With lucky rolls you'll get an 18 in your top/primary stat, with unlucky rolls you'll still get at least a 14 or so. The question is, how much difference does that luck make. For example, in AD&D1, the Dex bonuses are like this:

Dex 14: +0AC, Dex 15: -1AC, Dex 16: -2AC, Dex 17: -3AC, Dex 18: -4AC

In 3rd-5th edition, the bonuses are:

Dex 14: -2AC, Dex 15: -2AC, Dex 16: -3AC, Dex 17: -3AC, Dex 18: -4AC

So even if you have only a 14 in your thief's dexterity, you still get a +2 bonus to missiles and AC. In AD&D1, you get nothing with a 14. The range of 14 to 18 has a much bigger change.

The contrast is a bit less in Strength, but still notable.

Str 14: +0 , Str 18/51: +2 hit, +3 damage

Later editions:

Str 14: +2 , Str 18: +4

If you only have a 14 as your top stat in later editions, the difference is relatively minor: just a +-2 difference compared to 18. In AD&D1, with high stats you will not only have greater change of bonus/benefit -- there's also the +10% experience bonus and possibly more class options like druid, illusionist, ranger, and paladin.
You are ignoring the earlier part where I noted that in modern versions your attributes increase via experience.  So the real change in a modern iteration is from 16 (+3) to 20 (+5) in your primary (usually by 8th level).  Plus another couple of points in a different stat later.  In early editions stats didn't change, except due to magic (which was DM dependent).  Your 18 lasted your entire career, only modified by the magic you could find.  In fact, many magic items that were valuable in early editions have become low-level or secondary items in later editions.  A belt of hill giant strength?  It was an upgrade for a fighter throughout his career in AD&D.  Now it's useless for one by 6th level.  A headband of intellect?  By 8th your Mage can give it away.  Your stats have become more important, as they are a larger part of your bonuses, as opposed to earlier editions, where your bonuses were mostly from the magic you found adventuring.  5e's proficiency bonus just exacerbates this even more.  Despite "bounded accuracy," your 6th level fighter can be +8 to hit and +5 to damage without a single magic item.  Show me how that is possible in 1st edition.  The stat increases via leveling are far more important in present editions than in early ones.
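As a quick arithmetic check of the 5e figure above, here is a sketch assuming the standard 5e proficiency progression and a fighter who has raised Strength to 20 through ability score increases; nothing below comes from the thread beyond those two numbers:

# 5e 6th-level fighter, no magic items (assumes Str 20 via the ASIs at levels 4 and 6)
level = 6
proficiency = 2 + (level - 1) // 4   # standard 5e progression: +3 at levels 5-8
str_mod = (20 - 10) // 2             # Strength 20 gives a +5 modifier

print("to hit:", proficiency + str_mod)   # 8
print("damage bonus:", str_mod)           # 5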
"Testosterone levels vary widely among women, just like other secondary sex characteristics like breast size or body hair. If you eliminate anyone with elevated testosterone, it's like eliminating athletes because their boobs aren't big enough or because they're too hairy." -- jhkim

Pat

Quote from: Eirikrautha on June 21, 2021, 11:08:30 PM

I highlighted the part where it didn't matter for you.
That sentence literally makes no sense, but I don't suppose it matters. The primary was in the previous sentence, and you were pretending it didn't apply to later sentences. Sentences don't work like that. They build on each other.

We're talking about bonuses in AD&D. You're saying a +1 doesn't matter. That's false. A +1 in AD&D usually matters significantly more than in later games, like 3e. Bonuses are harder to acquire, and on a smaller scale, so each matters more.

Also, bonuses in AD&D are highly skewed toward the ends of the probability distribution. There's little difference between a 7 and a 14, but each point from 15 to 18 typically gives a significant bonus. This contrasts with Basic D&D, where the breakpoints are 13, 16, and 18. With 3d6 in order, a Basic character will typically have one or two bonuses. An AD&D character using the same method can expect zero. That makes the bonuses rarer in AD&D, further increasing their significance. The same applies to OD&D (13 is generally the only breakpoint), and later editions that give a +1 for every 2 points in a stat.
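A quick sketch of the 3d6-in-order claim, enumerating all 216 outcomes of 3d6 and counting how often a single stat clears the Basic breakpoint (13+) versus the typical AD&D breakpoint (15+) described above:

from itertools import product

# All 216 equally likely 3d6 totals
rolls = [sum(dice) for dice in product(range(1, 7), repeat=3)]

def p_at_least(threshold: int) -> float:
    return sum(1 for r in rolls if r >= threshold) / len(rolls)

p13 = p_at_least(13)  # Basic D&D breakpoint
p15 = p_at_least(15)  # typical AD&D breakpoint

# Expected number of qualifying stats across six 3d6-in-order rolls
print(f"P(3d6 >= 13) = {p13:.3f}; expected stats at 13+ out of six: {6 * p13:.2f}")  # about 1.6
print(f"P(3d6 >= 15) = {p15:.3f}; expected stats at 15+ out of six: {6 * p15:.2f}")  # about 0.6

So a Basic character averages one or two stats carrying a bonus, while an AD&D character averages fewer than one, which is the rarity being described.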

You're also underestimating the impact of Strength. It's true that percentile strength is even more highly skewed than other stats, because the bonuses are thin in the typical 15-18 range, and you need to roll an 18 plus be a fighter and then roll percentile strength. But if you manage to do so, it's not just a +1 or a +2 to hit. It's also a +3 to +5 (ignoring the 00 option, as you did) to damage. Given the typical weapon does somewhere between 2.5 and 5.5 damage (S-M), that's a huge increase in damage potential. Especially when combined with multiple attacks, thrown weapons, and so on. Even an 18/01 can more than double a character's damage output, and that's not even including the increased likelihood to hit. There's a reason the double dart specialist fighter was a trope.

Finally, you're talking about single digit increases in the odds for Con. That's misleading, because while it's true in the absolute sense, what matters is the relative improvement. The percentiles you roll against using Con are system shock and resurrection survival. These aren't rolling to see if you get a bonus, but rolling to see if you avoid something very bad, like dying or dying permanently. So when your odds increase from 94% to 99%, you've literally reduced the chance of dying by a factor of 5. Saying it's just a single digit improvement is pretending that's not the case.
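A small sketch of the relative-versus-absolute point, expressing each survival chance as a failure rate and computing how much that failure rate shrinks; the 70%-to-90% pair is the comparison raised in the replies that follow:

def failure_reduction(before: float, after: float) -> tuple[float, float]:
    """Return (absolute drop in failure chance, factor by which the failure rate shrinks)."""
    fail_before, fail_after = 1 - before, 1 - after
    return fail_before - fail_after, fail_before / fail_after

# Survival chances under discussion, as fractions
for before, after in [(0.94, 0.99), (0.70, 0.90)]:
    drop, factor = failure_reduction(before, after)
    print(f"{before:.0%} -> {after:.0%}: failure rate falls by {drop:.0%} "
          f"(a factor of about {factor:.0f})")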

Eirikrautha

Quote from: Pat on June 22, 2021, 06:04:25 AM

Finally, you're talking about single digit increases in the odds for Con. That's misleading, because while it's true in the absolute sense, what matters is the relative improvement. The percentiles you roll against using Con are system shock and resurrection survival. These aren't rolling to see if you get a bonus, but rolling to see if you avoid a something very bad, like dying or dying permanently. So when your odds increase from 94% to 99%, you've literally reduced the chance of dying by a factor of 5. Saying it's just a single digit improvement is pretending that's not the case.

So, according to you, an increase from 94% to 99% is better than an increase of 70% to 90% (from lower score increases), because of the relative odds increase (6x vs 3x)?  Do you work for the CDC?

Going from a 30% chance of dying to a 10% chance is better than going from a 6% chance to a 1%.  The percent decrease of the odds is irrelevant next to the absolute chance of the occurrence.  If, by spending $1 million, we could reduce the fatality of one disease from 30% to 10% or another disease with the same number of cases from 6% to 1%, you would advocate for the second option because of the relative change in odds?  Which would lower the number of fatalities more?

Yours is a textbook example of how to distort statistics to try to reach a desired result.  Ask any player (or statistician) which will increase your success rate more: to increase your odds from 7 in ten to 9 in ten or from 18 in 20 to 19 in 20.  They'll pick the former every time.
"Testosterone levels vary widely among women, just like other secondary sex characteristics like breast size or body hair. If you eliminate anyone with elevated testosterone, it's like eliminating athletes because their boobs aren't big enough or because they're too hairy." -- jhkim

Pat

Quote from: Eirikrautha on June 22, 2021, 06:43:44 AM
Quote from: Pat on June 22, 2021, 06:04:25 AM

Finally, you're talking about single digit increases in the odds for Con. That's misleading, because while it's true in the absolute sense, what matters is the relative improvement. The percentiles you roll against using Con are system shock and resurrection survival. These aren't rolling to see if you get a bonus, but rolling to see if you avoid a something very bad, like dying or dying permanently. So when your odds increase from 94% to 99%, you've literally reduced the chance of dying by a factor of 5. Saying it's just a single digit improvement is pretending that's not the case.

So, according to you, an increase from 94% to 99% is better than an increase of 70% to 90% (from lower score increases), because of the relative odds increase (6x vs 3x)?  Do you work for the CDC?

Going from a 30% chance of dying to a 10% chance is better than going from a 6% chance to a 1%.  The percent decrease of the odds is irrelevant next to the absolute chance off the occurrence.  If, by spending $1 million we could reduce the fatality of one disease from 30% to 10% or another disease with the same number of cases from 6% to 1%, you would advocate for the second option because of the relative change in odds?  Which would lower the number of fatalities more?

Yours is a textbook example of how to distort statistics to try to reach a desired result.  Ask any player (or statistician) which will increase your success rate more: to increase your odds from 7 in ten to 9 in ten or from 18 in 20 to 19 in 20.  They'll pick the former every time.
You've already proved you don't understand statistics in the other thread, you don't have to keep showing off that lack of ability.

Yes, reducing your odds by a factor of 5 is a greater improvement than reducing your odds by a factor of 3. There are situations where the absolute number matters, for instance the number of people affected across an entire population, but when discussing risk avoidance at an individual level the goal is to avoid the problem entirely. As a result, reducing the risk to a very low level matters. If you have a 60% chance of dying and reduce it to 40%, that's an improvement. But you're probably going to die quickly anyway after just one or two checks. But if you reduce the chance from 5% to 1%, then you'll typically survive 5 times as long. That's a huge improvement.

HappyDaze

Quote from: Pat on June 22, 2021, 07:15:22 AM
Quote from: Eirikrautha on June 22, 2021, 06:43:44 AM
Quote from: Pat on June 22, 2021, 06:04:25 AM

Finally, you're talking about single digit increases in the odds for Con. That's misleading, because while it's true in the absolute sense, what matters is the relative improvement. The percentiles you roll against using Con are system shock and resurrection survival. These aren't rolling to see if you get a bonus, but rolling to see if you avoid a something very bad, like dying or dying permanently. So when your odds increase from 94% to 99%, you've literally reduced the chance of dying by a factor of 5. Saying it's just a single digit improvement is pretending that's not the case.

So, according to you, an increase from 94% to 99% is better than an increase of 70% to 90% (from lower score increases), because of the relative odds increase (6x vs 3x)?  Do you work for the CDC?

Going from a 30% chance of dying to a 10% chance is better than going from a 6% chance to a 1%.  The percent decrease of the odds is irrelevant next to the absolute chance off the occurrence.  If, by spending $1 million we could reduce the fatality of one disease from 30% to 10% or another disease with the same number of cases from 6% to 1%, you would advocate for the second option because of the relative change in odds?  Which would lower the number of fatalities more?

Yours is a textbook example of how to distort statistics to try to reach a desired result.  Ask any player (or statistician) which will increase your success rate more: to increase your odds from 7 in ten to 9 in ten or from 18 in 20 to 19 in 20.  They'll pick the former every time.
You've already proved you don't understand statistics in the other thread, you don't have to keep showing off that lack of ability.

Yes, reducing your odds by a factor of 5 is a greater improvement than reducing your odds by a factor of 3. There are situations where absolute number matters, for instance the number of people affected across an entire population, but when discussing risk avoidance at an individual level the goal is to avoid the problem entirely. As a result, reducing the risk to a very low level matters. If you have a 60% chance of dying and reduce it to 40%, that's an improvement. But you're probably going to die quickly anyway after just one or two checks. But if you reduce the change from 5% to 1%, then you'll typically survive 5 times as long. That's a huge improvement.
I think the difference is that you're looking at it from the POV of an individual and how it impacts him/her, while the other asshole is looking at it in terms of how it impacts a population. Both are relevant, but the POV has to be considered.

Wrath of God

The longer I play, the more I randomize, even characters in systems that do not support random builds at all.
"Never compromise. Not even in the face of Armageddon."

"And I will strike down upon thee
With great vengeance and furious anger"


"Molti Nemici, Molto Onore"

Eirikrautha

Quote from: Pat on June 22, 2021, 07:15:22 AM
Quote from: Eirikrautha on June 22, 2021, 06:43:44 AM
Quote from: Pat on June 22, 2021, 06:04:25 AM

Finally, you're talking about single digit increases in the odds for Con. That's misleading, because while it's true in the absolute sense, what matters is the relative improvement. The percentiles you roll against using Con are system shock and resurrection survival. These aren't rolling to see if you get a bonus, but rolling to see if you avoid a something very bad, like dying or dying permanently. So when your odds increase from 94% to 99%, you've literally reduced the chance of dying by a factor of 5. Saying it's just a single digit improvement is pretending that's not the case.

So, according to you, an increase from 94% to 99% is better than an increase of 70% to 90% (from lower score increases), because of the relative odds increase (6x vs 3x)?  Do you work for the CDC?

Going from a 30% chance of dying to a 10% chance is better than going from a 6% chance to a 1%.  The percent decrease of the odds is irrelevant next to the absolute chance off the occurrence.  If, by spending $1 million we could reduce the fatality of one disease from 30% to 10% or another disease with the same number of cases from 6% to 1%, you would advocate for the second option because of the relative change in odds?  Which would lower the number of fatalities more?

Yours is a textbook example of how to distort statistics to try to reach a desired result.  Ask any player (or statistician) which will increase your success rate more: to increase your odds from 7 in ten to 9 in ten or from 18 in 20 to 19 in 20.  They'll pick the former every time.
You've already proved you don't understand statistics in the other thread, you don't have to keep showing off that lack of ability.

Yes, reducing your odds by a factor of 5 is a greater improvement than reducing your odds by a factor of 3. There are situations where absolute number matters, for instance the number of people affected across an entire population, but when discussing risk avoidance at an individual level the goal is to avoid the problem entirely. As a result, reducing the risk to a very low level matters. If you have a 60% chance of dying and reduce it to 40%, that's an improvement. But you're probably going to die quickly anyway after just one or two checks. But if you reduce the change from 5% to 1%, then you'll typically survive 5 times as long. That's a huge improvement.
So, let's say you have two saves, Wis and Con, that occur about equally often in the game.  You have one "point" to spend to improve your saves.  You can improve your Wisdom from a 60% chance of success to an 80% chance of success, or you can improve your Con save from a 92% chance to a 98% chance.  Which do you spend your point on?  That's not a "population" level choice, btw (re Happyderp).  Tell me that the character will benefit more from spending the point on Con.  The relative change in odds is irrelevant next to the actual change in performance.  That is the proper application of statistics in this case.
"Testosterone levels vary widely among women, just like other secondary sex characteristics like breast size or body hair. If you eliminate anyone with elevated testosterone, it's like eliminating athletes because their boobs aren't big enough or because they're too hairy." -- jhkim

HappyDaze

Quote from: Eirikrautha on June 22, 2021, 08:35:23 AM
Quote from: Pat on June 22, 2021, 07:15:22 AM
Quote from: Eirikrautha on June 22, 2021, 06:43:44 AM
Quote from: Pat on June 22, 2021, 06:04:25 AM

Finally, you're talking about single digit increases in the odds for Con. That's misleading, because while it's true in the absolute sense, what matters is the relative improvement. The percentiles you roll against using Con are system shock and resurrection survival. These aren't rolling to see if you get a bonus, but rolling to see if you avoid a something very bad, like dying or dying permanently. So when your odds increase from 94% to 99%, you've literally reduced the chance of dying by a factor of 5. Saying it's just a single digit improvement is pretending that's not the case.

So, according to you, an increase from 94% to 99% is better than an increase of 70% to 90% (from lower score increases), because of the relative odds increase (6x vs 3x)?  Do you work for the CDC?

Going from a 30% chance of dying to a 10% chance is better than going from a 6% chance to a 1%.  The percent decrease of the odds is irrelevant next to the absolute chance off the occurrence.  If, by spending $1 million we could reduce the fatality of one disease from 30% to 10% or another disease with the same number of cases from 6% to 1%, you would advocate for the second option because of the relative change in odds?  Which would lower the number of fatalities more?

Yours is a textbook example of how to distort statistics to try to reach a desired result.  Ask any player (or statistician) which will increase your success rate more: to increase your odds from 7 in ten to 9 in ten or from 18 in 20 to 19 in 20.  They'll pick the former every time.
You've already proved you don't understand statistics in the other thread, you don't have to keep showing off that lack of ability.

Yes, reducing your odds by a factor of 5 is a greater improvement than reducing your odds by a factor of 3. There are situations where absolute number matters, for instance the number of people affected across an entire population, but when discussing risk avoidance at an individual level the goal is to avoid the problem entirely. As a result, reducing the risk to a very low level matters. If you have a 60% chance of dying and reduce it to 40%, that's an improvement. But you're probably going to die quickly anyway after just one or two checks. But if you reduce the change from 5% to 1%, then you'll typically survive 5 times as long. That's a huge improvement.
So, let's say you have two saves Wis and Con that occur about equally in the game.  You have one "point" to spend to improve your saves.  You can improve your Wisdom from a 60% chance of success to an 80% chance of success, or you can improve your Con save from a 92% chance to a 98% chance.  Which do you spend you point on?  That's not a "population" level choice, btw (re Happyderp).  Tell me that the character will benefit more from spending the point on Con.  The relative change in odds are irrelevant next to the actual change in performance.  That is the proper application of statistics in this case.
We were talking about one variable (the save); you're setting up a comparison between two. That's literally apples to oranges, and there's no reason you'd only have "one point" to split between them like you set up in your nonsense response.

When talking about one variable, one guy goes from a 2/5 failure rate to a 1/5 failure rate while the second goes from 5/100 to 1/100. The first cut his chances of failure by a factor of 2 while the second cut it by a factor of 5.

Chris24601

Quote from: Eirikrautha on June 22, 2021, 08:35:23 AM
Quote from: Pat on June 22, 2021, 07:15:22 AM
Quote from: Eirikrautha on June 22, 2021, 06:43:44 AM
Quote from: Pat on June 22, 2021, 06:04:25 AM

Finally, you're talking about single digit increases in the odds for Con. That's misleading, because while it's true in the absolute sense, what matters is the relative improvement. The percentiles you roll against using Con are system shock and resurrection survival. These aren't rolling to see if you get a bonus, but rolling to see if you avoid a something very bad, like dying or dying permanently. So when your odds increase from 94% to 99%, you've literally reduced the chance of dying by a factor of 5. Saying it's just a single digit improvement is pretending that's not the case.

So, according to you, an increase from 94% to 99% is better than an increase of 70% to 90% (from lower score increases), because of the relative odds increase (6x vs 3x)?  Do you work for the CDC?

Going from a 30% chance of dying to a 10% chance is better than going from a 6% chance to a 1%.  The percent decrease of the odds is irrelevant next to the absolute chance off the occurrence.  If, by spending $1 million we could reduce the fatality of one disease from 30% to 10% or another disease with the same number of cases from 6% to 1%, you would advocate for the second option because of the relative change in odds?  Which would lower the number of fatalities more?

Yours is a textbook example of how to distort statistics to try to reach a desired result.  Ask any player (or statistician) which will increase your success rate more: to increase your odds from 7 in ten to 9 in ten or from 18 in 20 to 19 in 20.  They'll pick the former every time.
You've already proved you don't understand statistics in the other thread, you don't have to keep showing off that lack of ability.

Yes, reducing your odds by a factor of 5 is a greater improvement than reducing your odds by a factor of 3. There are situations where absolute number matters, for instance the number of people affected across an entire population, but when discussing risk avoidance at an individual level the goal is to avoid the problem entirely. As a result, reducing the risk to a very low level matters. If you have a 60% chance of dying and reduce it to 40%, that's an improvement. But you're probably going to die quickly anyway after just one or two checks. But if you reduce the change from 5% to 1%, then you'll typically survive 5 times as long. That's a huge improvement.
So, let's say you have two saves Wis and Con that occur about equally in the game.  You have one "point" to spend to improve your saves.  You can improve your Wisdom from a 60% chance of success to an 80% chance of success, or you can improve your Con save from a 92% chance to a 98% chance.  Which do you spend you point on?  That's not a "population" level choice, btw (re Happyderp).  Tell me that the character will benefit more from spending the point on Con.  The relative change in odds are irrelevant next to the actual change in performance.  That is the proper application of statistics in this case.
I'd bump Con. Better to be virtually immune to a category that can include "death" as an effect than have significant windows (20% failure rate means failure is still going to happen every 5 or so checks) for both "mind-control" and "death" to hit you.

It's also easier to take proactive steps to avoid one weakness than to have to take steps to mitigate two. It's easier to avoid eye contact with a mesmerist than it is to stop the poison on an arrow AND still have to worry about the mesmerist.

Pat is absolutely right on the statistics. 92% means about 1 in 12.5 rolls will be a failure, while 98% drops that to only 1 failure in 50.

Pat

Quote from: Eirikrautha on June 22, 2021, 08:35:23 AM
Quote from: Pat on June 22, 2021, 07:15:22 AM
Quote from: Eirikrautha on June 22, 2021, 06:43:44 AM
Quote from: Pat on June 22, 2021, 06:04:25 AM

Finally, you're talking about single digit increases in the odds for Con. That's misleading, because while it's true in the absolute sense, what matters is the relative improvement. The percentiles you roll against using Con are system shock and resurrection survival. These aren't rolling to see if you get a bonus, but rolling to see if you avoid a something very bad, like dying or dying permanently. So when your odds increase from 94% to 99%, you've literally reduced the chance of dying by a factor of 5. Saying it's just a single digit improvement is pretending that's not the case.

So, according to you, an increase from 94% to 99% is better than an increase of 70% to 90% (from lower score increases), because of the relative odds increase (6x vs 3x)?  Do you work for the CDC?

Going from a 30% chance of dying to a 10% chance is better than going from a 6% chance to a 1%.  The percent decrease of the odds is irrelevant next to the absolute chance off the occurrence.  If, by spending $1 million we could reduce the fatality of one disease from 30% to 10% or another disease with the same number of cases from 6% to 1%, you would advocate for the second option because of the relative change in odds?  Which would lower the number of fatalities more?

Yours is a textbook example of how to distort statistics to try to reach a desired result.  Ask any player (or statistician) which will increase your success rate more: to increase your odds from 7 in ten to 9 in ten or from 18 in 20 to 19 in 20.  They'll pick the former every time.
You've already proved you don't understand statistics in the other thread, you don't have to keep showing off that lack of ability.

Yes, reducing your odds by a factor of 5 is a greater improvement than reducing your odds by a factor of 3. There are situations where absolute number matters, for instance the number of people affected across an entire population, but when discussing risk avoidance at an individual level the goal is to avoid the problem entirely. As a result, reducing the risk to a very low level matters. If you have a 60% chance of dying and reduce it to 40%, that's an improvement. But you're probably going to die quickly anyway after just one or two checks. But if you reduce the change from 5% to 1%, then you'll typically survive 5 times as long. That's a huge improvement.
So, let's say you have two saves Wis and Con that occur about equally in the game.  You have one "point" to spend to improve your saves.  You can improve your Wisdom from a 60% chance of success to an 80% chance of success, or you can improve your Con save from a 92% chance to a 98% chance.  Which do you spend you point on?  That's not a "population" level choice, btw (re Happyderp).  Tell me that the character will benefit more from spending the point on Con.  The relative change in odds are irrelevant next to the actual change in performance.  That is the proper application of statistics in this case.
Wis and Con aren't interchangeable. They do completely different things and benefit different characters in different ways.

If you make system shock saves with a 60% chance, or a 40% chance, you're probably dead on the first or second roll (your chance of surviving two rolls is 36% and 16%, respectively). Even an 80% chance gives you only a hair better than even odds (51.2%) of surviving 3 rolls. There's a much bigger difference as you get close to the edges of the probability range, because a 92% chance can be expected to survive 8 or 9 rolls, and a 98% can be expected to survive 34 to 35.
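A short sketch of the survival arithmetic in this post, treating each check as independent: the average number of rolls until the first failure is 1 / (1 - p), and the "can be expected to survive N rolls" figures correspond to the last point at which the chance of having made every roll so far is still at least 50%:

import math

def mean_rolls_to_first_failure(p: float) -> float:
    """Average total rolls until the first failed check (geometric distribution)."""
    return 1 / (1 - p)

def median_survival_rolls(p: float) -> int:
    """Largest n with p**n >= 0.5, i.e. you more likely than not survive n checks."""
    return math.floor(math.log(0.5) / math.log(p))

for p in (0.60, 0.80, 0.92, 0.98):
    n = median_survival_rolls(p)
    print(f"{p:.0%}: about one failure every {mean_rolls_to_first_failure(p):.1f} rolls; "
          f"better-than-even odds of surviving {n} rolls ({p ** n:.1%})")

This reproduces the figures above (about 3 rolls at 80%, 8 at 92%, and 34 at 98%) as well as the 1-in-12.5 and 1-in-50 failure rates mentioned earlier in the thread.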


jhkim

Quote from: Eirikrautha on June 22, 2021, 06:03:39 AM
You are ignoring the earlier part where I noted that in modern versions your attributes increase via experience.  So the real change in a modern iteration is from 16 (+3) to 20 (+5) in your primary (usually by 8th level).  Plus another couple of points in a different stat later.  In early editions stats didn't change, except due to magic (which was DM dependent).  Your 18 lasted your entire career, only modified by the magic you could find.

I'm not sure what we're arguing here. I agree with this. Because your stats were harder to change in AD&D1, it made stat rolls much more important. If a player's best roll was only a 14 instead of an 18, they were stuck with that forever. And they got 10% less experience to boot.

Quote from: Eirikrautha on June 22, 2021, 06:03:39 AM
Your stats have become more important, as they are a larger part of your bonuses, as opposed to earlier editions, where your bonuses were mostly from the magic you found adventuring.  5e's proficiency bonus just exacerbates this even more.  Despite "bounded accuracy," your 6th level fighter can be +8 to hit and +5 to damage without a single magic item.  Show me how that is possible in 1st edition.

This part is mathematical obfuscation. In AD&D1, a fighter's hit chances increase by an average of +1 per level (technically +2 every two levels). This is expressed as a changed target number in the to-hit table -- but it has exactly the same effect as a bonus. That's a much greater increase than 5e's proficiency bonus. In 5e, proficiency increases by only +1 every 4 levels.

An argument could be made that by expressing proficiency as a "bonus," instead of through the to-hit table and a non-proficiency penalty, it increases its psychological importance to the players - but that is still unrelated to attribute scores.

I agree that earlier editions emphasized hoards of magic items much more than 5e, but that's a different issue than the changes to the stat-related rules.
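A rough side-by-side of the progression claim in this post, as a sketch only: it assumes the AD&D1 fighter attack matrix improves by 2 points every 2 levels, as stated above, uses the standard 5e proficiency table (+2 at 1st rising to +6 at 17th), and ignores attributes and magic items entirely:

def adnd1_fighter_tohit_gain(level: int) -> int:
    """AD&D1 fighter: to-hit improvement over 1st level (2 points every 2 levels)."""
    return ((level - 1) // 2) * 2

def proficiency_gain_5e(level: int) -> int:
    """5e: proficiency bonus gained since 1st level (+1 every 4 levels)."""
    return (level - 1) // 4

for level in (1, 5, 9, 13, 17):
    print(f"level {level:2d}: AD&D1 to-hit gain +{adnd1_fighter_tohit_gain(level)}, "
          f"5e proficiency gain +{proficiency_gain_5e(level)}")

By 17th level that is +16 of built-in to-hit progression in AD&D1 against +4 of proficiency in 5e, which is the gap the post is pointing at.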