more type-stable type-inference (#41697)
(this PR is the final output of my demo at [our workshop](https://github.com/aviatesk/juliacon2021-workshop-pkgdev))
This PR eliminates many of the runtime dispatches within our type-inference
routine that are reported by the following JET analysis:
```julia
using JETTest
const CC = Core.Compiler
# exclude these small utility functions from the report
function function_filter(@nospecialize(ft))
    ft === typeof(CC.isprimitivetype) && return false
    ft === typeof(CC.ismutabletype) && return false
    ft === typeof(CC.isbitstype) && return false
    ft === typeof(CC.widenconst) && return false
    ft === typeof(CC.widenconditional) && return false
    ft === typeof(CC.widenwrappedconditional) && return false
    ft === typeof(CC.maybe_extract_const_bool) && return false
    ft === typeof(CC.ignorelimited) && return false
    return true
end
# only report frames whose method is defined under the compiler/ directory
function frame_filter(frame)
    linfo = frame.linfo
    meth = linfo.def
    isa(meth, Method) || return true
    return occursin("compiler/", string(meth.file))
end
report_dispatch(CC.typeinf, (CC.NativeInterpreter, CC.InferenceState); function_filter, frame_filter)
```
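Most of the reported dispatches come from abstractly-typed values flowing into calls that inference can't resolve statically. As a rough, hypothetical sketch (not code from this PR), the typical fix is to narrow such a value with a concrete type annotation or assertion:
```julia
# Hypothetical example: a field declared as `Any` forces a runtime
# dispatch at every call site that uses it.
struct Frame
    result::Any
end

unstable(f::Frame) = length(f.result)  # `length` is dispatched at runtime

# Asserting the concrete type the value is known to have lets
# inference resolve the `length` call statically.
stable(f::Frame) = length(f.result::Vector{Int})
```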
> on master
```
═════ 137 possible errors found ═════
...
```
> on this PR
```
═════ 51 possible errors found ═════
...
```
This PR also seems to make JIT compilation slightly faster:
> on master
```
~/julia/julia master
❯ ./usr/bin/julia -e '@time using Plots; @time plot(rand(10,3));'
3.659865 seconds (7.19 M allocations: 497.982 MiB, 3.94% gc time, 0.39% compilation time)
2.696410 seconds (3.62 M allocations: 202.905 MiB, 7.49% gc time, 56.39% compilation time)
```
> on this PR
```
~/julia/julia avi/jetdemo* 7s
❯ ./usr/bin/julia -e '@time using Plots; @time plot(rand(10,3));'
3.396974 seconds (7.16 M allocations: 491.442 MiB, 4.80% gc time, 0.28% compilation time)
2.591130 seconds (3.48 M allocations: 196.026 MiB, 7.29% gc time, 56.72% compilation time)
```