Yesterday I noted that I’m planning on making dragons a lot more common in our AD&D game than I’ve made them in the past (read: virtually unused) and how having uberpowerful “name” dragons doesn’t have to mean that all dragons are uberpowerful.
While I’m basically onboard with dragons as they’re written in the AD&D Monster Manual, one thing that’s always bugged me a bit is the fact that the damage from their claw/claw/bite routine is the same across all age brackets and sizes. Regardless of what power level you want your dragons to be, the idea that a monster with very specific age and size categories causes the same melee damage regardless of category is more than a little disappointing.
I’m not the only one to be bugged a bit by this, of course, and an article called ‘Dragon Damage Revised’ in Dragon Magazine #98 by Leonard Carpenter had a very good and detailed system to fix it. Each dragon was given three sets of damage numbers for each of three age ranges, and this was further expanded to cover each of the three size categories for a total of nine sets of claw/claw/bite per dragon. I intended to use the numbers from the article in play, but I’m not sure that I ever actually did due to the extreme rarity of dragons in my campaigns.
However, I’ve long since embraced the “less is more” approach to gaming, and, even though it’s not easy to reconcile this viewpoint with my decision to go with AD&D, I’m trying to keep my tinkering and optional rules to a minimum. When I do make a change, I try to keep it as simple as possible.
So here’s how I’m going to handle dragon damage. The goal is to tone down the bite damage in the immature age brackets (below adult) while bumping up the claw damage in the mature brackets. If the amount of adjustment on each side were about equal, I would have just as happily left them both alone and called it even. However, after spending some time playing with the numbers, I decided it was worth making tweaks to each.
So here is what I’m planning to do:
A couple of notes:
First, notice that in addition to the adjustments for age there is an adjustment for huge-sized dragons. It’s not a lot, but it’s a little bit of a kicker for the biggest of the dragons besides the extra HD and associated increase to breath weapon damage.
Secondly, it looks like the young adult category could easily have a small positive modifier for claws and a small negative modifier for the bite. In fact, when I finally settled on my numbers there was a +1 per claw attack and a -2 per bite. However, the minimalist in me pointed out that two +1s and one -2 pretty much cancel each other out, so I dropped both.
If I really wanted to differentiate while sticking to this minimal table, I could round the half-damage bites down for very young dragons and up for young ones. But the point is to make dragon damage a little more reasonable without jacking anything up or requiring a lot of external look-ups. This could easily be penciled into the MM next to the age categories.
I’ve rarely seen dragons used, either as a player or as DM, and I think that’s a shame. From what I see online, it appears that many players have had similar experiences. A lot of it, I think, has to do with how they’ve been pumped up over the years into truly horrendous monsters who can wipe out everyone before you can say “TPK.” And this pumping up has a lot to do with how they’re portrayed in literature and film as the biggest, baddest monsters around.
No one wants Smaug to be killed by a party of well-equipped mid-level adventurers, of course, as a huge ancient red dragon could be in 1e. This is part of why, back in the day, I was strongly in favor of pumping up dragons significantly…the first time I ever placed a dragon in a dungeon, it was defeated in a couple of rounds. It was a blue dragon, I fudged hit points and dice rolls to make it tougher, and it was still killed rather quickly with only moderate harm to the party.
But I now believe the solution to the ‘Smaug Problem’ is not to beef up dragons across the board, turning every dragon into a supermonster that seriously threatens even the most powerful adventurers. Rather, it’s to decide that “name” dragons like Smaug and other legendary wyrms are exceptional examples of their kind, with their own special boosts or extra abilities, while most dragons conform to those detailed in the Monster Manual.
Declare Smaug a huge ancient red dragon with extra hit dice, a better AC (remember the gems embedded in his scales?), the ability to use his breath weapon five or six times per day instead of three, and a +2 on all saving throws. Just don’t boost all huge ancient red dragons the same way, scaling every other dragon upward to match, simply because you want a Smaug-like dragon in your game to be a truly fearsome opponent. Keeping the boosts special has a couple of benefits.
First, the special “name” dragons will seem a lot more exceptional. Smaug won’t be just another huge ancient red dragon; he’ll be “Smaug,” with special abilities to go with the name.
Secondly, it will keep non-name dragons normal. And this will allow them to be placed into dungeons and the wilderness much more frequently without killing off every last living thing in a five mile radius. They’re tough, for sure, and at the apex of the predator pyramid. But they won’t throw the entire game out of whack if their numbers are bumped up a bit.
Dragons in the original D&D game were quite weak, even compared to the 1e AD&D versions. And, I’m guessing, they weren’t nearly so uncommon, since they hadn’t yet become the thermonuclear supermonsters of later editions.
Who wouldn’t like to see more dragons in the game?
UPDATE: All this said, I do think the claw/claw/bite numbers could stand a little fiddling. I will have a follow-up post on how I’m going to do that.
One of the things I’ve found the most tedious about calculating experience in AD&D is the per hit point part of the creature’s value. I understand why a creature with more hit points is worth more than one of its fellows with fewer, but what a pain. A huge deal? No. But I don’t think it adds enough to the game to be worth it.
So I’ve decided to shortcut it. Rather than doing the calculation for every individual monster, I calculate the value of one monster of each type with average hit points (4.5 per HD) and use that number for every copy encountered. I still roll hit points normally, so the results should average out just fine.
For example, three gnolls I just rolled have 11, 6, and 11 hit points. The listed XP value is 28 + 2/hp, so they should be worth 50, 40, and 50 XP respectively, a total of 140. Using my method: 2 XP per hit point × 4.5 hit points per HD × 2 HD + 28 = 46 XP per gnoll, for a total of 138.
Another example: gray ooze (3+3 HD) is 200 + 5/hp by the book. The gray ooze I just rolled has 16 hit points, so it is worth 280 XP. My method: 5 XP per hit point × 4.5 hit points per HD × 3 HD + 15 (for the three extra hp at 5 each) + 200 = 282.5, or 283 XP.
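The shortcut boils down to a one-line formula. Here’s a minimal Python sketch of the calculation above; the function and variable names are my own, not anything from the books:

```python
# Sketch of the average-hit-point XP shortcut described above.
# xp_base and xp_per_hp come from the monster's listed XP value
# (e.g. gnoll: 28 + 2/hp); hd and bonus_hp come from its Hit Dice
# (3+3 HD -> hd=3, bonus_hp=3).
import math

AVG_HP_PER_HD = 4.5  # average roll of a d8 Hit Die


def shortcut_xp(xp_base, xp_per_hp, hd, bonus_hp=0):
    """XP for one monster, assuming average hit points per Hit Die."""
    avg_hp = AVG_HP_PER_HD * hd + bonus_hp
    # Round any half-XP up, in the players' favor.
    return math.ceil(xp_base + xp_per_hp * avg_hp)


print(shortcut_xp(28, 2, 2))      # gnoll (2 HD, 28 + 2/hp) -> 46
print(shortcut_xp(200, 5, 3, 3))  # gray ooze (3+3 HD, 200 + 5/hp) -> 283
```

Work the value out once per monster type, jot it down, and you never touch the per-hit-point math at the table again.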
I’m making a list of monsters as I use them and adding the value to Appendix E in the DMG, though I might write them into the Monster Manual as well. This is an example of what I mean when I say we’re trying to play “mostly-by-the-book”: We don’t want to change anything if we can possibly help it, and when we change or houserule something it’s going to be with as little distance from BTB as we can manage.
Note: Sometimes, as with the gray ooze example above, you end up with half an XP. I always round this up in favor of the players. But then I’m a softie pushover DM like that.
I’ve never really cared for the fact that AD&D figures 60% of dungeon rooms are empty, at least according to Appendix A: Random Dungeon Generation (DMG, pg. 170-172). Sure, I could make my own table or use the percentages from B/X (my personal favorite), but I’m trying to play a mostly-BTB 1e game, and that means using house rules or rules from other editions only when absolutely necessary.
So it dawned on me tonight, while rolling up some room contents, that if the result “1-12 Empty” is re-rolled whenever it comes up and the second roll is kept, the distribution comes out to about what I want.
Here is how Table V. F.: Chamber or Room Contents (DMG, pg 171) looks:
This has roughly the same number of empty rooms as those populated with monsters (36% vs. 40%). Seems about right to me and I didn’t have to mess with anything. Plus, it knocks down the number of traps, which is just fine with me.
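For anyone who wants to check the arithmetic, here’s a quick Python sketch of the re-roll tweak. The base percentages are my reading of Table V.F. (1-12 Empty, 13-14 Monster only, 15-17 Monster and treasure, 18 Special, 19 Trick/trap, 20 Treasure), so double-check them against your own DMG:

```python
# With "Empty" re-rolled once and the second roll kept, an empty room
# requires two Empty results in a row, while every other entry gets a
# second chance at coming up after a first-roll Empty.
base = {
    "Empty": 0.60,                 # 1-12
    "Monster only": 0.10,          # 13-14
    "Monster and treasure": 0.15,  # 15-17
    "Special": 0.05,               # 18
    "Trick/trap": 0.05,            # 19
    "Treasure": 0.05,              # 20
}

p_empty = base["Empty"]
rerolled = {k: (p * p if k == "Empty" else p + p_empty * p)
            for k, p in base.items()}

for k, p in rerolled.items():
    print(f"{k}: {p:.0%}")
```

Empty drops from 60% to 36%, and the two monster rows together come to 40%, which matches the numbers above.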
This is something that’s been on my mind for a long, long time. Though I don’t actually intend to use it in our game, I wanted to write it up and get it out there for some feedback.
I think it’s no secret that a lot of folks have various issues with the cleric class. Despite the fact that my first-ever PC was a cleric (who killed a vampire in his first adventure), the class has never sat well with me, for a number of reasons. First, I’m not particularly interested in the mythological religious aspects of the class; my games are generally fairly light on such things. Second, the class’s real place in the game seems sort of up in the air: some see clerics as medics, while others see them as undead specialists. I’ve always looked at them as mystic warriors akin to Jedi Knights, but even that is a stretch, particularly considering the class’s weapon restrictions. Finally, I believe the experience point requirements for the class are outlandishly low given the cleric’s capabilities.
Several years back, helped by the lack of variable weapon damage in Swords & Wizardry White Box, I planned to remove the restriction on edged weapons and make clerics the mystic warriors I’d always envisioned them to be. Unfortunately, our S&W:WB game petered out and the opportunity was lost.
Now we’re playing 1e AD&D and not only is the cleric still a problem in my mind, it’s been compounded by the more-than-slightly redundant paladin. My second PC, for what it’s worth, was a paladin, but I’ve never really been a big fan of the class.
So I’ve pondered a solution that not only removes the redundancy but addresses what I dislike about the cleric: combine the two classes into one new class that mostly covers what the two original classes stand for. In a move to further distance the new class from the cleric’s religious connections, I’ve decided to call it the paladin. Besides, there is a lot more historical precedent for that name than for the standard cleric, and it just sounds cooler.
Anyway, here is a draft of what I’ve come up with. I’ve written it up in 1e format and style as if it were in the PHB and I’ve left the religious aspects intact as I know most games make much more use of that sort of thing than ours does.
It retains much of what the original paladin has, and I hope it hits a good middle ground that will be potentially useful to some who dislike the cleric. My thinking is that, in most cases, anywhere it says “cleric” in the books should be read as “paladin,” with most of the weapons and items for fighters also available for this new class. (I’m sure that there are conflicts that I’ve not thought of and I’d appreciate anyone pointing them out.)
What I’m really uncertain about is the XP requirement scale, but any feedback will be very welcome. I’ve stuck with the standard paladin scale for now, with HD and most paladin abilities knocked down a bit but spell casting added at 3rd level. I think it’s probably in the right ballpark, anyway, but would not hesitate to tweak it if good reasons were given.
As I’ve said, we aren’t actually planning to run this class in our game; I’m trying hard to run a mostly-BtB AD&D game these days and though I’d love to incorporate this class, it just doesn’t fit in with what I’m after.
UPDATE: Due to conflicting file names, I think some people were getting only a link to the PDF and missing the blog post. I’ve fixed that now, and apologize for the confusion.
I wasn’t going to do this, but decided what the heck and answered the quick rules questions that have been making the rounds. Our game is basic 3-book AD&D.
- Ability scores generation method?
- 4d6 drop lowest, arrange as desired OR
- 3d6 12 times, keep 6 best, arrange as desired
- Player chooses before rolling anything
- How are death and dying handled?
- 0 hp to -CON gets a save vs. death to be unconscious
- Otherwise dead
- What about raising the dead?
- As per spells or devices and VERY rare
- How are replacement PCs handled?
- Depends, but usually met next time into town or on the road
- Initiative: individual, group, or something else?
- Group: 1d6 per side
- High roll wins, tie goes to the previous winner (momentum)
- Are there critical hits and fumbles? How do they work?
- Not as a standard part of the combat system
- Only in certain circumstances
- Can I hurt my friends if I fire into melee?
- Yes, if you miss your target
- Will we need to run from some encounters, or will we be able to kill everything?
- You will need to run from many encounters
- Level-draining monsters: yes or no?
- Oh, yeah.
- Lots of them
- Are there going to be cases where a failed save results in PC death?
- Save-or-die situations will almost always be obviously dangerous
- Except poison
- How strictly are encumbrance and resources tracked?
- Moderately strict encumbrance
- Pretty strict ammunition and water
- Very strict food and torches
- What’s required when my PC gains a level?
- Standard AD&D GP expense per DMG just “vanishes” from PC’s account
- Training is hand-waved but assumed to be ongoing
- One random automatic new spell when a new spell level is available
- Minimal downtime required…could be camp in the middle of an adventure
- What do I get experience for?
- Money treasure (coins, gems, jewelry, expensive art, precious metal items)
- Combat (enemies killed or totally defeated)
- No XP for magic items or the value of mundane items
- Sometimes an “adventure reward” for successful completion of goal-oriented adventures, but this is never very big
- How are traps located?
- Simple traps can be found by anyone looking
- More advanced traps can be found x% of the time by looking
- Some traps can only be found by skilled PCs (thieves, dwarves for stone traps, etc.)
- Thief or racial bonuses add to x% for non-skilled searchers
- Are retainers encouraged, and how does morale work?
- Yes, henchmen/retainers/hirelings are strongly encouraged
- Morale/loyalty works as per the AD&D rules except in exceptional cases
- How do I identify magic items?
- Usually they have to be tried out
- Tasting a potion tells the taster what it is
- The identify spell can assist with items
- Basic items (+1 sword, etc.) IDed by the DM after some use
- Weird items or special powers sometimes remain unknown for long periods of time
- Can I buy magic items? Oh, come on: how about just potions?
- Still no
- Can I create magic items? When and how?
- Get to that point and we’ll work it out
- What about splitting the party?
- Not my preference, for logistical reasons, but allowed
The topic of level limits for demi-human characters seems to be making the rounds, and I see that I’ve not really ever weighed in on it except in my Roll To Advance system of alternative XP/advancement rules, so I’ll toss in my two cents here.
I’ve personally never been a big fan of racial level limits, and for a long period of time our method was to allow demi-human advancement past the listed maximum by doubling XP requirements from that point on.
Though I don’t like racial level limits, I do think that humans get shafted by the game as written and believe that there should be something to offset the pretty significant bonuses that demi-humans get. I might be biased, being a human who prefers to play human PCs, but I prefer a human-centric game. And as most of our games don’t reach the point where level limits become an issue, the balancing “cap” never really does a lot to balance our games.
If I were to monkey with the system these days, my approach would be to tweak the XP requirements for demi-humans, requiring elves to pay X% more per level, for instance, to offset their increased capabilities. I don’t know exactly what the numbers would be for each race, but it would be something noticeable but not overwhelming. Enough to make it a real difference from session one but not enough to force everyone to play a human.
Our current game is by-the-book AD&D so I’m fighting the urge to tinker with this, but if I was going to it would be via XP requirement adjustments in place of level caps or giving humans some sort of racial bonus.
A few months back, I wrote about our plan to use Method K for ability score generation in our AD&D game. Method K was a “roll 3d6 in order, swap any two” system with a seventh “bonus” roll that could be swapped into the mix if desired. The idea was to keep the 3d6 range of scores and the randomness of in-order rolling while giving the player a small amount of control through score swapping; the seventh roll also gave an extra chance at a high score, or a way to avoid a really low one. But at the time I commented:
One thing with our current method is that anything besides base cleric, fighter, magic-user, or thief is VERY uncommon. That’s not a problem in my mind, but I suspect that players may be frustrated with the fact that almost no one will ever get to be rangers, druids, etc. We’ll see how it goes.
How it went was “this isn’t what we want,” and it wasn’t just the players who ended up frustrated with it; I also came to the conclusion that the method wasn’t a good fit for how we wanted our game to play. So Method K is out.
Our new plan is to allow players a choice of Method I (4d6, drop lowest, six times, arrange as desired) or Method II (3d6 twelve times, keep the best six, arrange as desired). The scores are a little higher but still not “superhero” high, and the full control over arrangement means that, if they roll well enough, players can choose whichever class they want. Though I still believe that there’s a lot to be said for the forced creativity that results from trying to make the most of a PC you wouldn’t necessarily have chosen, player satisfaction improved when we made the change, and now we’ll actually see some PCs besides the four basic classes. This is AD&D, after all.
Anyway, back when we were discussing this, I made a spreadsheet to simulate 10,000 sets of scores by each of the two methods. As one would figure, Method I results in more high or low scores than Method II at the cost of a lower overall average. I came across the spreadsheet and thought I’d post it.
Here is a screenshot of the top of the spreadsheet:
Each line is a set of ability scores arranged high to low. 17s and 18s are green, 15s and 16s are yellow, and scores below 10 are red. The red numbers near the top of each column are the averages for that column, so the average fourth-highest roll on Method I is 11.8 while the average fourth-highest roll on Method II is 12.2. At the right are total counts of how many times each score was rolled and a graph comparing the two methods.
The graph tells the story, so here’s a better look at it:
There’s probably nothing here that surprises anyone, but it’s nice to see so clearly how the curves of the two methods compare.
Personally, I’d probably go for Method II every time. But I can see how gambling for a better chance at some top-end scores at the risk of a bad score or two and a lower overall average would appeal to some, especially if a base class is the goal anyway. Selling out in hopes of a 17 or 18 in a prime requisite might be worth it.
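For anyone who’d rather script it than spreadsheet it, here’s a rough Python stand-in for the simulation above. The summary statistics are my own picks for showing the trade-off (everything here is a sketch, not the actual spreadsheet):

```python
# Simulate both generation methods and compare overall average,
# count of 17s/18s, and count of sub-10 scores.
import random

random.seed(1)


def method_one():
    """Method I: 4d6, drop lowest, six times."""
    return [sum(sorted(random.randint(1, 6) for _ in range(4))[1:])
            for _ in range(6)]


def method_two():
    """Method II: 3d6 twelve times, keep the best six."""
    rolls = [sum(random.randint(1, 6) for _ in range(3)) for _ in range(12)]
    return sorted(rolls, reverse=True)[:6]


def summarize(method, trials=10_000):
    scores = [s for _ in range(trials) for s in method()]
    avg = sum(scores) / len(scores)
    top = sum(1 for s in scores if s >= 17)   # 17s and 18s
    low = sum(1 for s in scores if s < 10)    # sub-10 scores
    return avg, top, low


for name, method in (("Method I", method_one), ("Method II", method_two)):
    avg, top, low = summarize(method)
    print(f"{name}: average {avg:.2f}, 17s/18s {top}, below 10 {low}")
```

Run it and you should see the same story as the graph: Method I produces more 17s and 18s and more sub-10 scores, while Method II wins on overall average.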