Hi - I'm working on a design that will hopefully reach mass production (millions of units/year). To save cost, I'd like to use an op-amp with a high input offset voltage. This will introduce an error into some analog readings. Can I calibrate out this error at the factory? The temperature coefficient of the offset voltage is quite low, so that's not a concern. I'm just worried that, for whatever reason, the offset will drift over the life of the part.
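To make the question concrete, here's a rough sketch of the kind of factory calibration I have in mind (names and the 12-bit ADC numbers are just placeholders, not my actual design): with the input held at a known reference, record the raw ADC reading once at the factory, store the difference in nonvolatile memory, and subtract it from every reading afterward.

```c
#include <stdint.h>

/* Hypothetical one-point offset calibration. In the real design this
 * value would be written to flash/EEPROM during the factory test step. */
static int16_t cal_offset_counts;

/* Factory step: with the amplifier input tied to a known reference,
 * anything the ADC reads beyond the expected code is offset error. */
void factory_calibrate(int16_t raw_at_reference, int16_t expected_counts)
{
    cal_offset_counts = raw_at_reference - expected_counts;
}

/* Runtime: subtract the stored offset from every sample. */
int16_t corrected_reading(int16_t raw)
{
    return raw - cal_offset_counts;
}
```

So my worry, restated in these terms: `cal_offset_counts` is measured once, but if the op-amp's offset drifts over the product's lifetime, the stored correction slowly goes stale.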
Edit: also, can you count on the offset error always having the same polarity? (i.e., would all chips from one manufacturer have a positive offset?)