In physics, it’s common to develop a formula and then stick in a constant to account for the unknown. For example, Newton’s theory of gravity uses the gravitational constant G in the formula `F = G * m_1 * m_2 / r^2`. Later on, Einstein gave a more accurate explanation with the theory of relativity, which does not rely on such a constant: `E = m * c^2`. Constants provide a good-enough description of the laws of physics that has remained useful for centuries.

I was wondering what’s the equivalent in social studies? How do researchers deal with the uncertainty of human behaviour?

Edit: Comments made me remember how much I don’t understand the theory of relativity, terrible example, sorry for the confusion. I need to rephrase the question but I don’t know how.

I am looking for “glue” concepts, things that help connect observations with theory. I.e., if I calculate `m_1 * m_2 / r^2`, the result is slightly off, but if I account for G, an empirical constant derived from observation, then everything makes sense for the observable universe.
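A rough sketch of that “glue constant” idea in Python. The Earth and Moon values below are approximate, illustrative numbers, not anything from the thread:

```python
# The same ratio m1*m2/r^2, scaled by the empirical constant G,
# turns a dimensionless-looking number into a physically correct force.
# Earth-Moon values are approximate illustrations.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2 (measured, not derived)
m_earth = 5.972e24     # kg
m_moon = 7.348e22      # kg
r = 3.844e8            # mean Earth-Moon distance, m

raw = m_earth * m_moon / r**2      # "slightly off": wrong scale, wrong units
force = G * raw                    # on the order of 10^20 N

print(f"{force:.3e} N")
```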

Also, as someone said, I am referring to social studies.

I believe you mean social sciences. They do that by indicating statistical significance or deviation.

Humanities, like philosophy, languages and arts don’t use formulas all that much to describe their work, but it could probably be done using a similar statistical approach for some things.
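As a minimal sketch of that statistical approach, here is a z-test on made-up survey scores (every number below is hypothetical, chosen only for illustration):

```python
import math

# Did a hypothetical survey group score differently from a known
# population mean? Social scientists quantify the uncertainty with
# a test statistic and a significance level rather than a constant.

sample = [102, 98, 110, 105, 97, 104, 108, 101]   # made-up scores
pop_mean = 100.0

n = len(sample)
mean = sum(sample) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
se = sd / math.sqrt(n)                     # standard error of the mean

z = (mean - pop_mean) / se                 # standardized deviation
p = math.erfc(abs(z) / math.sqrt(2))       # two-sided p-value (normal approx.)

print(f"z = {z:.2f}, p = {p:.3f}")
```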

`E = mc^2` is not an equation relating to gravity in the way you imply; that’d be the Einstein field equations [1], which still depend on `G`. And as far as we know, `c` is also a constant.

Then I’d guess a bunch of statistical constants probably show up in the humanities all the time. Is that what you are looking for, or some closed-form expression with the constant?

[1] https://en.m.wikipedia.org/wiki/Einstein_field_equations

C is still a constant, no?

Look, Mr Hari Seldon, you’re going to need to work these out on your own.

TIL https://en.m.wikipedia.org/wiki/Hari_Seldon

Seldon develops psychohistory, an algorithmic science that allows him to predict the future in probabilistic terms. On the basis of his psychohistory he is able to predict the eventual fall of the Galactic Empire and to develop a means to shorten the millennia of chaos to follow. The significance of his discoveries lies behind his nickname “Raven” Seldon.

TIL??

Oh man, you are lucky. You have the chance to read the Foundation series for the first time. I’m kinda jealous and highly recommend it.

Death and taxes? 🤔

You reminded me of this exchange between Robert Cousins and Andrew Gelman:

Our [particle physicists’] problems and the way we approach them are quite different from some other fields of science, especially social science. As one example, I think I recall reading that you do not mind adding a parameter to your model, whereas adding (certain) parameters to our models means adding a new force of nature (!) and a Nobel Prize if true. As another example, a number of statistics papers talk about how silly it is to claim a 10^-4 departure from 0.5 for a binomial parameter (ESP examples, etc.), using it as a classic example of the difference between nominal (probably mismeasured) statistical significance and practical significance. In contrast, when I was a grad student, a famous experiment in our field measured a 10^-4 departure from 0.5 with an uncertainty of 10% of itself, i.e., with an uncertainty of 10^-5. (Yes, on the order of 10^10 Bernoulli trials: counting electrons being scattered left or right.) This led quickly to a Nobel Prize for Steven Weinberg et al., whose model (now “Standard”) had predicted the effect.

I replied:

This interests me in part because I am a former physicist myself. I have done work in physics and in statistics, and I think the principles of statistics that I have applied to social science also apply to the physical sciences. Regarding the discussion of Bem’s experiment, what I said was not that an effect of 0.0001 is unimportant, but rather that if you were to really believe Bem’s claims, there could be effects of +0.0001 in some settings, -0.002 in others, etc. If this is interesting, fine: I’m not a psychologist. One of the key mistakes of Bem and others like him is to suppose that, if they happen to have discovered an effect in some scenario, it represents some sort of universal truth. Humans differ from each other in a way that elementary particles do not.

And Cousins replied:

Indeed in the binomial experiment I mentioned, controlling unknown systematic effects to the level of 10^-5, so that what they were measuring (a constant of nature called the Weinberg angle, now called the weak mixing angle) was what they intended to measure, was a heroic effort by the experimentalists.
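Cousins’ trial count checks out with back-of-envelope arithmetic: near p = 0.5 the binomial standard error is sqrt(p(1-p)/n), so demanding an uncertainty of 10^-5 pins down n. A quick sketch:

```python
# Solve se = sqrt(p*(1-p)/n) for n, the number of Bernoulli trials
# needed to measure a binomial parameter near 0.5 to within 1e-5.

p = 0.5
target_se = 1e-5

n = p * (1 - p) / target_se**2    # required trial count
print(f"n = {n:.1e} trials")      # on the order of 10^9-10^10, as he says
```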

I think gravity and light work the same on psych majors as they do on physics and engineering students…

I kid.

So in biology I know that e, Euler’s number, is important. It is used in growth equations (in finance and physics as well).
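A minimal sketch of e in a continuous-growth equation, N(t) = N0 * e^(r*t); the 3% rate and starting value are arbitrary illustrations:

```python
import math

# Continuous growth at rate r: the doubling time is ln(2)/r,
# regardless of the starting value N0.

N0 = 1000.0
r = 0.03                     # 3% per year, illustrative
t = math.log(2) / r          # doubling time in years

N = N0 * math.exp(r * t)     # should be exactly double N0
print(f"doubles in {t:.1f} years -> {N:.0f}")
```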

Statistics is fucking huge in every field. That is how you measure uncertainty. Bell curves and the five-number summary and all that stuff are how you analyze thousands of widgets coming off an assembly line, or measurements in the social sciences.
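For instance, the five-number summary (min, Q1, median, Q3, max) of a batch of hypothetical widget measurements, using only Python’s standard library; the values are made up:

```python
import statistics

# Made-up widget widths coming off an assembly line, in millimeters.
widths_mm = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3, 10.1, 9.9]

# statistics.quantiles with n=4 returns the three quartile cut points.
q1, med, q3 = statistics.quantiles(widths_mm, n=4)
summary = (min(widths_mm), q1, med, q3, max(widths_mm))

print(summary)
```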