When object fields are evaluated, the locals from the object
are added to the environment. These locals should have the same
environment as the field; in particular, they should be
at the same inheritance level. Instead, they were evaluated as if
they were at the level from which the original field lookup was performed,
resulting in subtle and hard-to-debug issues.
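As an illustration (a sketch, not the original repro), here is the kind of program where the inheritance level of an object local matters, evaluated through the Go API:
```go
package main

import (
	"fmt"
	"log"

	"github.com/google/go-jsonnet"
)

func main() {
	vm := jsonnet.MakeVM()
	snippet := `
		local mid = {
		  local fromSuper = super.f,
		  f: 2,
		  g: fromSuper,
		};
		({ f: 1 } + mid + { f: 3 }).g
	`
	out, err := vm.EvaluateSnippet("locals.jsonnet", snippet)
	if err != nil {
		log.Fatal(err)
	}
	// Binding the local at the level where g is defined makes super.f see
	// { f: 1 }, so this prints 1; binding it at the level where the lookup
	// started would instead see f from mid (2).
	fmt.Print(out)
}
```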
We didn't set the environment (upvalues) for objects
created as comprehensions; we set the upvalues for each field
separately, but that meant the locals were missing.
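A minimal sketch (not the original failing case) of a comprehension whose field expressions capture a surrounding local, which is why the comprehension object needs those upvalues too:
```go
package main

import (
	"fmt"
	"log"

	"github.com/google/go-jsonnet"
)

func main() {
	vm := jsonnet.MakeVM()
	snippet := `
		local prefix = "item-";
		{ [prefix + k]: std.length(prefix + k) for k in ["a", "b"] }
	`
	out, err := vm.EvaluateSnippet("comprehension.jsonnet", snippet)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(out) // an object with fields "item-a" and "item-b", both 6
}
```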
Keep object locals only once in AST
For example, this reduces the size of the stdlib AST file
roughly 3x. Note that this change doesn't regenerate the stdlib,
so that the diff here stays sane.
It is likely to slightly improve the performance of code using
a lot of locals (~10% on bench.05.gen.jsonnet).
The desugaring is more straightforward now, and we're back
to desugaring each node exactly once.
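Roughly, the shape of the change, sketched with hypothetical types (not the real ast package):
```go
package sketch

// localBind stands in for one `local name = body` binding.
type localBind struct {
	Name string
	Body interface{} // placeholder for an AST node
}

// Before: every desugared field carried its own copy of the object's locals,
// so an object with L locals and F fields stored roughly L*F bindings.
type fieldWithCopiedLocals struct {
	Name   string
	Body   interface{}
	Locals []localBind // duplicated per field
}

// After: the object stores its locals once and the fields share them, which
// is what shrinks the generated stdlib AST file.
type objectWithSharedLocals struct {
	Locals []localBind
	Fields []struct {
		Name string
		Body interface{}
	}
}
```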
So far `std.jsonnet` needed to be updated separately from
the cpp-jsonnet submodule. Since we should update it
at the same time anyway (to make sure the tests keep passing),
we can just as well get it directly from there, eliminating
the extra step of copying the new `std.jsonnet` version.
This change also updates the cpp-jsonnet version in use.
Add go mod
Also make the build docs nicer and clean up the directory layout to be in line
with most other Go projects. This also makes it possible to build jsonnet
without setting -o on go build.
* Minimal C bindings
* Fix version reporting in C bindings
* Apply suggestions about C bindings implementation
* Rename compat/ -> c-bindings/
* Add comment about indexing VMs in C bindings
My understanding of the origin of this bug is that we previously created
thunks for binary operator arguments. The environment was no longer needed
once the thunks were created, so the calls could be tail calls.
Now we call i.evaluate directly (for performance), and the
environment must not be destroyed during the evaluation of the first
argument.
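A hypothetical sketch of the two strategies, with illustrative types rather than the interpreter's real ones:
```go
package sketch

type value interface{}

type environment map[string]value

// thunk delays evaluation and keeps its own reference to the environment.
type thunk struct {
	env  environment
	expr func(environment) value
}

func (t *thunk) force() value { return t.expr(t.env) }

// viaThunks wraps both operands up front. After this point the caller no
// longer needs env itself, so applying the operator can be a tail call.
func viaThunks(env environment, lhs, rhs func(environment) value, op func(l, r *thunk) value) value {
	return op(&thunk{env, lhs}, &thunk{env, rhs})
}

// direct evaluates the left operand immediately (faster), but env is still
// needed for the right operand, so it must stay alive throughout.
func direct(env environment, lhs, rhs func(environment) value, op func(l, r value) value) value {
	l := lhs(env)
	r := rhs(env)
	return op(l, r)
}
```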
The current go-jsonnet is not really 0.12.1; it even
reports 0.12.0. This brings in all the good stuff
from recent cpp-jsonnet commits and actually syncs
the version.
This commit refactors travisBuild.sh to support building pushed branches in addition
to PRs and tags.
This was prompted by enabling Travis on a forked repository. The development
flow then becomes:
- Fork google/go-jsonnet
- Enable Travis on the forked repository
- Push a branch to the forked repository
- Travis runs on the pushed branch
- Submit a pull request once happy and tests pass in the forked repository
I tested 3 scenarios and they seem to behave nicely:
- Running CI for pushed branch
- Running CI for tag
- Running CI for a Pull Request
As a side note, the previous error path didn't work as intended:
- The TRAVIS_PULL variable didn't exist
- We weren't exiting with a non-zero value in the else branch, so the build
succeeded instead of failing when travisBuild.sh didn't know what to do
with the pushed branch
Given the <tag> chosen on the GitHub releases page, this builds jsonnet
for darwin & linux amd64 (we can add more if needed), and uploads the binaries to
gs://jsonnet/<tag>/<os>/<arch>/jsonnet
People use these operators in tight loops, without even
thinking about it, and their previous implementation required
multiple object lookups (std.), string comparisons (for types)
and multiple Jsonnet function calls.
This change introduces a builtin, efficient implementation.
It results in a ~3x speedup in the strContains benchmark that
Angus provided on Slack.
An additional benefit is that the equals/primitiveEquals distinction
is now obsolete, which makes things simpler for everyone.
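For a sense of the pattern that benefits, a sketch driving an equality-heavy loop through the Go API (the snippet is illustrative, not the benchmark mentioned above):
```go
package main

import (
	"fmt"
	"log"
	"time"

	"github.com/google/go-jsonnet"
)

func main() {
	vm := jsonnet.MakeVM()
	// String equality in a tight loop: the kind of code that used to go
	// through std.-based helpers and now hits the builtin implementation.
	snippet := `
		local xs = std.makeArray(10000, function(i) "needle-" + i);
		std.count([x == "needle-42" for x in xs], true)
	`
	start := time.Now()
	out, err := vm.EvaluateSnippet("eqloop.jsonnet", snippet)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(out) // 1
	fmt.Println("elapsed:", time.Since(start))
}
```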
* Add test support for multi-file output.
* Add -update support for multi-file output tests.
* Add support for string output in multi-file output mode.
* Rename 'stringOutput' to 'stringOutputMode' to better express what it does.
* Refactor main_test to make it less nested.
This also causes the -update flag to output a list of the files that
have been updated. This does not include the paths that are deleted
for multi-file tests.
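For context, a sketch of multi-file output through the Go API, assuming the EvaluateSnippetMulti entry point: the top-level value maps output filenames to contents, and in string output mode each value must itself be a string that is written out verbatim rather than serialized as JSON.
```go
package main

import (
	"fmt"
	"log"

	"github.com/google/go-jsonnet"
)

func main() {
	vm := jsonnet.MakeVM()
	// Top-level object: keys are output filenames, values are file contents.
	snippet := `
		{
		  "a.json": { x: 1 },
		  "b.json": { y: 2 },
		}
	`
	files, err := vm.EvaluateSnippetMulti("multi.jsonnet", snippet)
	if err != nil {
		log.Fatal(err)
	}
	for name, content := range files {
		fmt.Printf("-- %s --\n%s", name, content)
	}
}
```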
Change the dump code so that it hides the values of variable
definitions if they're large. This means that godoc.org should
be able to deal with the output, and the godoc output is
readable without needing to read through a huge struct literal definition.
Other approaches might be to always generate an extra variable
(seems unnecessary) or to pass the writer explicitly to the dump
methods rather than swapping s.w out temporarily. The former
seems unnecessarily intrusive to the usual output; the latter
seems unnecessarily intrusive to the source itself.
YMMV.
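One way to get this effect, sketched as an assumption rather than what the dump code actually emits: godoc renders var declarations but not function bodies, so a generated file can keep the variable definition to one line and bury the large literal inside a function.
```go
package generated

// StdAst is what godoc shows; the declaration stays one line because the
// large literal lives inside makeStdAst, whose body godoc does not render.
var StdAst = makeStdAst()

func makeStdAst() map[string]interface{} {
	return map[string]interface{}{
		// ... imagine a huge generated literal here ...
		"kind": "Object",
	}
}
```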