From: J. Clarke on
jmfbahciv wrote:
> J. Clarke wrote:
>> jmfbahciv wrote:
>>> J. Clarke wrote:
>>>> jmfbahciv wrote:
>>>>> J. Clarke wrote:
>>>>>> jmfbahciv wrote:
>>>>>>> Andrew Usher wrote:
>>>>>>>> Bob Myers wrote:
>>>>>>>>
>>>>>>>>>> Drills already have interchangeable bits,
>>>>>>>>> Ah, another person who's never seen the inside of
>>>>>>>>> a machine shop...
>>>>>>>> OK, perhaps I didn't use the right terminology; I used that with
>>>>>>>> which I am familiar. Nevertheless, my point stands that you
>>>>>>>> don't normally need a different machine for each different size
>>>>>>>> of drilling.
>>>>>>> Now ask the question why that is so.
>>>>>> I'm not sure I see the point of this particular discussion. Most
>>>>>> drills have three-jaw chucks that don't really require much of
>>>>>> the drill bit other than that it be round and not so big that it
>>>>>> won't fit in the hole or so small that the jaws won't close on it
>>>>>> (typically about a 20:1 range). Certainly no drill press I have
>>>>>> owned or worked with has had any trouble with bits that are
>>>>>> fractional inch sizes, metric sizes, or sizes that are pretty
>>>>>> much arbitrary.
>>>>>>
>>>>>> There are machines that require bits with tapered shanks or that
>>>>>> use collets that require shanks of a specified dimension and
>>>>>> form, or that require threaded shanks, but they are relatively
>>>>>> rare--most drilling is done with the bits secured in a 3-jaw
>>>>>> chuck and 3-jaw chucks are measurement-system agnostic.
>>>>> The reason the 3-jaw chuck exists is to adapt to any system: US,
>>>>> SI, or Sears.
>>>> No, it's to let you use the same drill with a tiny little bit or a
>>>> great big huge bit. The other option is to make the bit with a
>>>> standard sized shank, which means that the bits will all have steps
>>>> in them, which makes them more expensive to manufacture.
>>>>
>>>>>> Now if you're dealing with very small drills, circuit board
>>>>>> drills, and the like, they do often have a standard shank
>>>>>> diameter, mainly because their small diameter would make them
>>>>>> difficult to handle otherwise (like you'd need tweezers and a
>>>>>> magnifier to change bits) and there the measurement system does
>>>>>> matter, but swapping out a collet takes seconds.
>>>>>>
>>>>> Thus, the specification of the drills included adapting to any
>>>>> size. The reason for the generic design is that there was more
>>>>> than one flavor.
>>>> Exercise--go down to Home Depot and look at the drill bits and
>>>> think about what they would have to look like if 3-jaw chucks that
>>>> could take any size were not in widespread use. Note that there
>>>> are very small ones and very big ones and ones in between. Then
>>>> think about how such a thing would be made. Then think about why
>>>> anybody in his right mind would make them that way if there was
>>>> another option. Then tell us whether you still think that the
>>>> existence of 3-jaw chucks has anything to do with metric vs inch.
>>>>
>>>> You usually come across as a very sensible person but on this
>>>> particular issue you're way off base.
>>>>
>>>>
>>> I'm thinking about how drills changed over the years _before_
>>> electricity. I don't remember ever seeing hand drills with
>>> the option of changing the bits. Do you know the ones I'm
>>> talking about? You held the shaft with both hands and rotated;
>>> the shaft looked like a step-function graph.
>>
>> Carpenter's brace-and-bit, which used a square taper shank. So
>> happens that the two-jaw chuck for those was developed about the
>> same time as the three-jaw chuck for round bits in the late 1800s.
>> Prior to that time a square hole was used for the square taper
>> shanks and collets for the round bits. Neither had anything to do
>> with the introduction of the metric system.
>
> Kewl. Thanks. I haven't said that it has to do with the intro
> of the metric system. I was trying to talk about why a drill
> was designed to accept all sizes of bits. Read ^my line^ about
> adapting to any system: US, SI, or Sears. I was supporting the
> comment. Right after that someone disagreed.
>
> I think we're talking past each other here.
>
>
>>
>> Just as an example, a starter set of drill bits for someone taking up
>> machining in the US will have 115 bits ranging in diameter from 0.04"
>> up to 0.5". Any drill that will take all those bits will also take any
>> metric bit from 1mm to 12.5mm. Drills smaller than 1/4 inch or so
>> with square tapered shanks are rare--you won't find any in such a
>> set no matter how old it is. What you will find is a round taper,
>> called a Morse taper, which was developed in the US in the 1860s and
>> is now the subject of an ISO standard and still in widespread use. The
>> taper can be used directly to mount tooling, but more commonly is
>> used to mount a chuck.
>>
>> Further, the fact that a country is metric doesn't mean that all
>> drill bits used for machining come in even fractions of a
>> millimeter. For example, to get a close clearance fit on a shaft
>> that is 10mm +/- .0001, you'd need a hole that is between 10.013
>> and 10.055 mm in diameter, or nominally 10.034mm.
>>
>> This is the nature of drilling and the reason that drills can take
>> bits of many sizes.
>>
>> By the way, power drills were in use long before electricity--they
>> ran on steam or water power.
>>
>
> Those power drills had to be stationary. How would you design a
> portable power drill that ran on water power?

You didn't; you took the work to the drill, not the drill to the work.
Things that we currently do with power drills used to be done with bow
drills, later replaced by geared hand-cranked drills; larger holes were
made with a brace and bit, and holes too big for a brace and bit with a
two-handed auger (generally the drill and bit were one piece, with a hole
through the shank through which one passed a wooden handle).

Precision drilling is still for the most part done with stationary
tools--getting accurate placement and angular alignment with a hand-held
drill is difficult even with jigs.
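
As a minimal sketch of the arithmetic quoted above (Python; every
figure is taken from the post itself, not from any drill or fit table):

MM_PER_INCH = 25.4

# The 115-piece starter set spans 0.04" to 0.5"; the quoted metric
# range is 1 mm to 12.5 mm. Converting the metric endpoints to inches:
for mm in (1.0, 12.5):
    print(mm, "mm =", round(mm / MM_PER_INCH, 4), "in")
# 1.0 mm  = 0.0394 in  (a hair under the smallest bit in the set)
# 12.5 mm = 0.4921 in  (just under the largest)

# Nominal hole diameter for the quoted clearance fit on a 10 mm shaft:
# midway between the stated limits of 10.013 mm and 10.055 mm.
lo, hi = 10.013, 10.055
print("nominal hole:", round((lo + hi) / 2, 3), "mm")  # -> 10.034 mm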


From: Andrew Usher on
Darwin123 wrote:

> I was making an analogy between you and Adolf Hitler in terms of
> irrationality. I wasn't calling you a killer or anything.

I think that sort of analogy is a real stretch.

> My analogy is based on the fact that your objections are based
> solely on political ideology. You apparently have no idea how
> standards work in an actual machine shop, or in a laboratory. Your
> main objection is that it is an example of "white-guilt."

My objections are not solely based on politics; I think I made that
clear.

> This is the first time that I have heard "white-guilt" and
> colonialism tied to the metric system. You made vague references to
> the French Revolution. I hypothesize that you associate the French
> Revolution with the metric system. My conjecture is that since the
> French Republic degenerated into a system vaguely like Communism, you
> associate the metric system with Communism. I may be wrong, but
> there is no other connection between the metric system and the left
> that I can think of.

I actually praise the French Revolution in general, and call myself a
socialist. You would know this if you'd read many of my posts.

Andrew Usher
From: Matt on
On Mon, 15 Feb 2010 08:06:21 -0500, jmfbahciv wrote:

>Andrew Usher wrote:
>> Matt wrote:
>>> And the Celsius temperature scale is just silly. Why throw away twice
>>> the whole-number granularity afforded by the Fahrenheit scale? Or the
>>> notion that 100 tends to suggest more of a milestone than 38 as a
>>> temperature extreme for comfort? Aren't the metric zealots gaga over
>>> powers of ten? Why not use a power of ten to describe a temperature
>>> that is extreme but survivable? Sterilizers operate near 100C. But
>>> the Celsius scale makes it easier for tabletop chemists to calibrate
>>> their thermometers.
>>
>> No, it doesn't, actually. If you want to measure the boiling of water,
>> it isn't any harder to use 212 F as 100 C - and you have to correct
>> for pressure anyway, to be accurate enough for calibration.
>>
>You obviously have not done any arithmetic.

That is simply an absurd statement.

... have not done *any* arithmetic?!

I would find it hard to believe it to be true of anyone posting here.
I suppose some equally absurd scoffing remark could be contrived in
supposed refutation of my statement. Still, I suspect that everyone
posting here has done arithmetic correctly at least once in their
life.

> Using 212 instead of 100
>is more difficult for every calculation.

Or not:

212 - 112 = 100.
Easy.

100 - 112 = -12
A negative number which may make subsequent calculations more
difficult and subject to error if the sign is dropped.

Again, "ease of calculation" is not the only consideration in the real
world. Conversational use of measured values has significance, too.

Anything below freezing is a negative number in Celsius. Not so handy
for numbers which happen often enough in mid-latitude winters.

> If you have your computer
>do it, it will be wrong.

Because ...?

> Using 100 implies that you don't have
>to do any numbers other than 1.

Is zero not a number?

So now going metric is about using fewer unique digits in a number?

Or not.

1 meter divided by 4 is 25 centimeters.

1 foot divided by 4 is 3 inches.
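
As a minimal sketch of the conversions in play here (Python; the 20 F
winter temperature is my own illustrative pick, not from the thread):

def f_to_c(f):
    # Fahrenheit to Celsius
    return (f - 32.0) * 5.0 / 9.0

for f in (212.0, 32.0, 20.0):
    print(f, "F =", round(f_to_c(f), 2), "C")
# 212.0 F = 100.0 C   (boiling at standard pressure)
# 32.0 F  = 0.0 C     (freezing)
# 20.0 F  = -6.67 C   (a mid-latitude winter day: negative only in Celsius)

# The even-division examples above:
print(100 / 4, "cm")  # 1 meter divided by 4 -> 25.0 cm
print(12 / 4, "in")   # 1 foot divided by 4  -> 3.0 in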

--
Matt
From: jmfbahciv on
Matt wrote:
> On Mon, 15 Feb 2010 08:06:21 -0500, jmfbahciv wrote:
>
>> Andrew Usher wrote:
>>> Matt wrote:
>>>> And the Celsius temperature scale is just silly. Why throw away twice
>>>> the whole-number granularity afforded by the Fahrenheit scale? Or the
>>>> notion that 100 tends to suggest more of a milestone than 38 as a
>>>> temperature extreme for comfort? Aren't the metric zealots gaga over
>>>> powers of ten? Why not use a power of ten to describe a temperature
>>>> that is extreme but survivable? Sterilizers operate near 100C. But
>>>> the Celsius scale makes it easier for tabletop chemists to calibrate
>>>> their thermometers.
>>> No, it doesn't, actually. If you want to measure the boiling of water,
>>> it isn't any harder to use 212 F as 100 C - and you have to correct
>>> for pressure anyway, to be accurate enough for calibration.
>>>
>> You obviously have not done any arithmetic.
>
> That is simply an absurd statement.
>
> ... have not done *any* arithmetic?!
>
> I would find it hard to believe it to be true of anyone posting here.
> I suppose some equally absurd scoffing remark could be contrived in
> supposed refutation of my statement. Still, I suspect that everyone
> posting here has done arithmetic correctly at least once in their
> life.
>
>> Using 212 instead of 100
>> is more difficult for every calculation.
>
> Or not:
>
> 212 - 112 = 100.
> Easy.
>
> 100 - 112 = -12
> A negative number which may make subsequent calculations more
> difficult and subject to error if the sign is dropped.
>
> Again, "ease of calculation" is not the only consideration in the real
> world. Conversational use of measured values has significance, too.
>
> Anything below freezing is a negative number in Celsius. Not so handy
> for numbers which happen often enough in mid-latitude winters.
>
>> If you have your computer
>> do it, it will be wrong.
>
> Because ...?
>
>> Using 100 implies that you don't have
>> to do any numbers other than 1.
>
> Is zero not a number?
>
> So now going metric is about using fewer unique digits in a number?
>
> Or not.
>
> 1 meter divided by 4 is 25 centimeters.
>
> 1 foot divided by 4 is 3 inches.
>

Did you ever take chemistry in high school?
Or physics?
Or home economics?

/BAH
From: Matt on
On Thu, 18 Feb 2010 08:27:06 -0500, jmfbahciv wrote:

>Matt wrote:
>> On Mon, 15 Feb 2010 08:06:21 -0500, jmfbahciv wrote:
>>
>>> Andrew Usher wrote:
>>>> Matt wrote:
>>>>> And the Celsius temperature scale is just silly. Why throw away twice
>>>>> the whole-number granularity afforded by the Fahrenheit scale? Or the
>>>>> notion that 100 tends to suggest more of a milestone than 38 as a
>>>>> temperature extreme for comfort? Aren't the metric zealots gaga over
>>>>> powers of ten? Why not use a power of ten to describe a temperature
>>>>> that is extreme but survivable? Sterilizers operate near 100C. But
>>>>> the Celsius scale makes it easier for tabletop chemists to calibrate
>>>>> their thermometers.
>>>> No, it doesn't, actually. If you want to measure the boiling of water,
>>>> it isn't any harder to use 212 F as 100 C - and you have to correct
>>>> for pressure anyway, to be accurate enough for calibration.
>>>>
>>> You obviously have not done any arithmetic.
>>
>> That is simply an absurd statement.
>>
>> ... have not done *any* arithmetic?!
>>
>> I would find it hard to believe it to be true of anyone posting here.
>> I suppose some equally absurd scoffing remark could be contrived in
>> supposed refutation of my statement. Still, I suspect that everyone
>> posting here has done arithmetic correctly at least once in their
>> life.
>>
>>> Using 212 instead of 100
>>> is more difficult for every calculation.
>>
>> Or not:
>>
>> 212 - 112 = 100.
>> Easy.
>>
>> 100 - 112 = -12
>> A negative number which may make subsequent calculations more
>> difficult and subject to error if the sign is dropped.
>>
>> Again, "ease of calculation" is not the only consideration in the real
>> world. Conversational use of measured values has significance, too.
>>
>> Anything below freezing is a negative number in Celsius. Not so handy
>> for numbers which happen often enough in mid-latitude winters.
>>
>>> If you have your computer
>>> do it, it will be wrong.
>>
>> Because ...?
>>
>>> Using 100 implies that you don't have
>>> to do any numbers other than 1.
>>
>> Is zero not a number?
>>
>> So now going metric is about using fewer unique digits in a number?
>>
>> Or not.
>>
>> 1 meter divided by 4 is 25 centimeters.
>>
>> 1 foot divided by 4 is 3 inches.
>>
>
>Did you ever take chemistry in high school?

Yes.

>Or physics?

Yes.

>Or home economics?

No.

You seem to be fixated on laboratory and academic environments. There
is a much larger world outside such controlled settings.

How about the importance of measuring as opposed to calculating?

Note Benford's Law:
http://en.wikipedia.org/wiki/Benford's_law
In lists of numbers from many (but not all) real-life sources of
data, the leading digit is distributed in a specific,
non-uniform way.

The higher increments of those nicely spaced divisions into tenths get
little use:
According to this law, the first digit is 1 almost one third
of the time, and larger digits occur as the leading digit
with lower and lower frequency, to the point where 9
as a first digit occurs less than one time in twenty.

There are reasons why people, left to their own devices, didn't
gravitate to dividing real-world lengths into tenths.
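
As a minimal sketch of the distribution the quote describes (Python;
Benford's law gives the leading digit d probability log10(1 + 1/d)):

import math

for d in range(1, 10):
    p = 100 * math.log10(1 + 1 / d)
    print("leading digit", d, ":", round(p, 1), "%")
# digit 1: 30.1 %  ("almost one third of the time")
# digit 9:  4.6 %  (less than one time in twenty)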

If one is an adherent of evolution, then *why* did people with ten
fingers win out over people with fewer digits? Perhaps because it
isn't necessarily fatal to lose a finger. And now a measurement system
comes along to enshrine factors of ten for measurements which don't
lend themselves to divisions into tenths, when fingers beyond perhaps
six or eight were considered expendable by nature. How many fingers
does Homer Simpson have? Why are we so accepting of cartoon characters
having fewer than ten fingers? Are we really just giving the
cartoonist a break?

Analogies between the metric system and decimalized monetary systems
are bogus. Unlike a unit of length, units of currency have no physical
reference in nature. Controlled experiments in a chemistry lab are
somewhat analogous to monetary systems in that they, too, deal with
contrived situations. The value of an ounce of gold is a cultural
convention. The length from here to there is a physical reality
regardless of the currency in one's wallet.