added scripts to generate parsable API docs #187
No description provided.
Issue: #131. There are two scripts: `generate_json_api_v1.py` and `generate_json_api_jrgen.py`.
Seems like there is a mix of py2- and py3-exclusive syntax used (the `print` statement in `contrib/devtools/clang-format.py`, which will fail as a syntax error in py3, and `def get_obj_from_dirty_text(full_object: str):`, which will fail as a syntax error in py2). Is this intentional? Should all of the code be py2-only, py3-only, or should it work in both?
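One quick way to surface this kind of mismatch is to try parsing every script with the interpreter you intend to support. The checker below is only an illustrative sketch, not something in this PR, and the directory glob is an assumption:

```python
#!/usr/bin/env python3
"""Report which scripts fail to parse under the running Python version."""
import ast
import pathlib
import sys


def parse_report(root="contrib/devtools"):
    failures = 0
    for path in sorted(pathlib.Path(root).glob("*.py")):
        try:
            ast.parse(path.read_text(), filename=str(path))
        except SyntaxError as err:
            # Under py3 this flags the py2-only `print` statement in
            # clang-format.py; the annotated signature of
            # get_obj_from_dirty_text would trip py2 the same way.
            failures += 1
            print("{}: line {}: {}".format(path, err.lineno, err.msg))
    return failures


if __name__ == "__main__":
    sys.exit(1 if parse_report() else 0)
```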
@eukreign, the clang-format scripts predate my time here. I just made them run, as they didn't before. I don't know that anyone here wrote them, judging from the copyright messages in them. As far as I know, they have always been Py2 scripts. For this PR, the focus should be on the "docs" scripts.
Alright, it was in this PR so I thought it was related. Which files should I review as part of this PR?
I've reviewed `generate_json_api_jrgen.py` and `generate_json_api_v1.py`. I like the Python type annotations!

My only hesitation is regarding `get_obj_from_dirty_text` in `generate_json_api_jrgen.py`; it's yuuuge! Also, that's a really big try/except; I would prefer to move that into the caller.

I'm approving since I think my critique above is just preference and I think this code is fine otherwise.
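To make the try/except point concrete, here is a rough sketch of the shape being suggested; the function bodies are invented for illustration and are not the actual code from `generate_json_api_jrgen.py`:

```python
import json


def get_obj_from_dirty_text(full_object: str) -> dict:
    # Stays focused on extraction and simply raises on malformed input
    # (json.JSONDecodeError is a subclass of ValueError).
    start = full_object.index("{")
    end = full_object.rindex("}") + 1
    return json.loads(full_object[start:end])


def parse_example(full_object: str):
    # The caller owns the error handling, so it can add whatever context
    # it knows about (which command, which help section, and so on).
    try:
        return get_obj_from_dirty_text(full_object)
    except ValueError as err:
        print("Skipping example that did not parse: {}".format(err))
        return None
```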
@BrannonKing can this be merged? Also, as part of this, can you run the script (the one that matches the daemon script) and commit the output as well?
Open questions on this:
We have a request to make the output available on the web.
Following from my answer on Slack about how to unit test these scripts: to be able to use the generated output as the test itself, you have to commit and version the output deliberately (not via Travis), so that when the docs do change, it can be verified for correctness (did the docs change because the input source changed, or because the generator script changed?). Therefore, my advice would be:
@@ -397,0 +378,4 @@
" \"name\" (string) the name of the claim\n"
" \"value\" (string) claim metadata\n"
" \"claimId\" (string) the claimId of this claim\n"
" \"txid\" (string) the hash of the transaction which has successfully claimed this name\n"
Just curious why you're re-ordering here. Is this related to something about the doc generation?
"valid at height" was between the supports array declaration and its definition. That was definitely wrong. I moved "height" as well just so that it matched the code filling it in.
👍 Thanks
Looks good to me. Has this been run through clang-format? It seems Travis is only unhappy with style.
I have not run the formatting. I'm having to undo 20 years of habit to remember to format before checking in. I still want to make that a report rather than a requirement.
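A "report rather than a requirement" could be as simple as counting the replacements clang-format would make and printing them without failing the build. This is only a sketch of that idea (it assumes clang-format is on the PATH and is meant to always exit 0):

```python
#!/usr/bin/env python3
"""Print a clang-format report instead of failing the build."""
import subprocess
import sys
import xml.etree.ElementTree as ET


def format_report(paths):
    total = 0
    for path in paths:
        result = subprocess.run(
            ["clang-format", "-output-replacements-xml", path],
            capture_output=True, text=True, check=True)
        count = len(ET.fromstring(result.stdout).findall("replacement"))
        if count:
            total += count
            print("{}: {} formatting change(s) suggested".format(path, count))
    print("total suggested changes: {}".format(total))


if __name__ == "__main__":
    format_report(sys.argv[1:])
```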
It sounds like we would additionally like a report from Travis (or the local build) that states that the generated API doc needs to be updated. This involves starting the server. Have we done that on Travis before? Do we run any integration tests on Travis?
No. Possibly related to #164
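On the "generated API doc needs to be updated" report: one possible shape is to regenerate into a temporary file and diff it against the committed copy. The paths, the --output flag, and the assumption that a daemon is already running for the generator to query are placeholders here, not the PR's actual interface:

```python
#!/usr/bin/env python3
"""Exit non-zero (with a message) when the committed API docs are stale."""
import filecmp
import subprocess
import sys
import tempfile

COMMITTED = "contrib/devtools/generated/api_v1.json"    # assumed path
GENERATOR = "contrib/devtools/generate_json_api_v1.py"  # script from this PR


def docs_up_to_date():
    with tempfile.NamedTemporaryFile(suffix=".json") as regenerated:
        # Assumes the generator accepts an output path and that a running
        # lbrycrdd instance is available for it to query.
        subprocess.run(
            [sys.executable, GENERATOR, "--output", regenerated.name],
            check=True)
        return filecmp.cmp(COMMITTED, regenerated.name, shallow=False)


if __name__ == "__main__":
    if docs_up_to_date():
        sys.exit(0)
    print("Generated API docs are stale; rerun the generator and commit the output.")
    sys.exit(1)
```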