added scripts to generate parsable API docs #187

Merged
BrannonKing merged 2 commits from add_api_docs_scripts into master 2018-08-30 22:18:49 +02:00
BrannonKing commented 2018-08-08 17:27:37 +02:00 (Migrated from github.com)
Issue: #131. There are two scripts:

  1. jrgen, matching https://github.com/mzernetsch/jrgen
  2. "v1", which matches https://github.com/lbryio/lbry/blob/master/scripts/generate_json_api.py
eukreign commented 2018-08-08 19:48:22 +02:00 (Migrated from github.com)

Seems like there is a mix of py2- and py3-exclusive syntax in use (the `print` statement in `contrib/devtools/clang-format.py`, which is a syntax error in py3, and `def get_obj_from_dirty_text(full_object: str):`, which is a syntax error in py2).

Is this intentional? Should all code target only py2, only py3, or work in both?
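For illustration only (not part of the PR): the two constructs named above really are mutually exclusive, and one portable pattern if both interpreters ever need to run the same script is a `__future__` import plus a PEP 484 type comment. The function body here is a toy stand-in, not the real parser:

```python
from __future__ import print_function  # gives py2 the py3 print() function

# An in-signature annotation like `def f(full_object: str):` is a hard
# syntax error under py2. A PEP 484 "type comment" carries the same
# information while parsing cleanly under both interpreters:
def get_obj_from_dirty_text(full_object):
    # type: (str) -> str
    """Toy stand-in for the real function; it only strips whitespace."""
    return full_object.strip()

print(get_obj_from_dirty_text("  {...}  "))  # prints {...}
```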
BrannonKing commented 2018-08-08 20:25:27 +02:00 (Migrated from github.com)

@eukreign , the clang-format scripts predate my time here. I just made it run as it didn't before. I don't know that anyone here wrote them, judging from the copyright messages in them. As far as I know, they have always been Py2 scripts. For this PR, the focus should be on the "docs" scripts.
eukreign commented 2018-08-08 20:29:04 +02:00 (Migrated from github.com)

Alright, it was in this PR so I thought it was related. Which files should I review as part of this PR?
eukreign (Migrated from github.com) approved these changes 2018-08-09 04:31:32 +02:00
eukreign (Migrated from github.com) left a comment

I've reviewed generate_json_api_jrgen.py and generate_json_api_v1.py. I like the Python type annotations!

My only hesitation is regarding get_obj_from_dirty_text in generate_json_api_jrgen.py, it's yuuuge! Also, that's a really big try/except; I would prefer to move that into the caller.

I'm approving since I think my critique above is just preference and I think this code is fine otherwise.
lyoshenka commented 2018-08-10 17:40:52 +02:00 (Migrated from github.com)

@BrannonKing can this be merged? Also, as part of this, can you run the script (the one that matches the daemon script) and commit the output as well?
BrannonKing commented 2018-08-10 20:04:33 +02:00 (Migrated from github.com)

Open questions on this:

  1. Should we check in the generated output as well? (Anyone making changes to the API would be responsible to rerun the script and check in an updated file.)
  2. Should we make the reproducible build script run the document generation?
  3. Should we make TravisCI run the document generation? And tag changes? Or have it check in the output?

We have a request to make the output available on the web.
eukreign commented 2018-08-10 20:39:35 +02:00 (Migrated from github.com)

Following from my answer on Slack about how to unit test these scripts: to be able to use the generated output as the test itself, you have to commit and version the output deliberately (not via Travis), so that when the docs do change the change can be verified for correctness (did the docs change because the input source changed, or because the generator script changed?). Therefore, my advice would be:

  1. Yes.
  2. Don't know what this means, something lbrycrd specific I assume?
  3. If you have extra time to dedicate to this task, then it would definitely be beneficial to run the generate script, verify that the latest output matches the previously committed output, and fail the build if it doesn't (asking the committer to run the doc build script locally, verify the changes make sense, and then commit the new json doc file).
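A minimal sketch of the check described in point 3 (hypothetical: the file contents and the CI wiring are illustrative, not from this PR). Comparing parsed JSON rather than raw text means the build only fails on real content drift, not on whitespace or key-order differences:

```python
import json

def docs_match(regenerated, committed):
    """Return True when freshly generated API docs agree with the
    committed copy. Both sides are parsed as JSON so that purely
    cosmetic differences (indentation, key order) do not fail CI."""
    return json.loads(regenerated) == json.loads(committed)

# Hypothetical CI step: fail the build when the committed doc is stale.
committed = '{"getclaimsforname": {"args": ["name"]}}'
regenerated = '{"getclaimsforname": {"args": ["name", "blockhash"]}}'

if not docs_match(regenerated, committed):
    print("API docs are stale; rerun the generator script and commit the result.")
```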
lbrynaut (Migrated from github.com) reviewed 2018-08-15 18:44:26 +02:00
@ -397,0 +378,4 @@
" \"name\" (string) the name of the claim\n"
" \"value\" (string) claim metadata\n"
" \"claimId\" (string) the claimId of this claim\n"
" \"txid\" (string) the hash of the transaction which has successfully claimed this name\n"
lbrynaut (Migrated from github.com) commented 2018-08-15 18:44:26 +02:00

Just curious why you're re-ordering here. Is this related to something about the doc generation?

Just curious why you're re-ordering here. Is this related to something about the doc generation?
BrannonKing (Migrated from github.com) reviewed 2018-08-15 19:03:24 +02:00
@ -397,0 +378,4 @@
" \"name\" (string) the name of the claim\n"
" \"value\" (string) claim metadata\n"
" \"claimId\" (string) the claimId of this claim\n"
" \"txid\" (string) the hash of the transaction which has successfully claimed this name\n"
BrannonKing (Migrated from github.com) commented 2018-08-15 19:03:24 +02:00

"valid at height" was between the supports array declaration and its definition. That was definitely wrong. I moved "height" as well just so that it matched the code filling it in.

"valid at height" was between the supports array declaration and its definition. That was definitely wrong. I moved "height" as well just so that it matched the code filling it in.
lbrynaut (Migrated from github.com) reviewed 2018-08-15 20:08:58 +02:00
@ -397,0 +378,4 @@
" \"name\" (string) the name of the claim\n"
" \"value\" (string) claim metadata\n"
" \"claimId\" (string) the claimId of this claim\n"
" \"txid\" (string) the hash of the transaction which has successfully claimed this name\n"
lbrynaut (Migrated from github.com) commented 2018-08-15 20:08:58 +02:00

👍 Thanks
lbrynaut commented 2018-08-15 20:11:09 +02:00 (Migrated from github.com)

Looks good to me. Has this been run through clang-format? It seems travis is only unhappy with style.
BrannonKing commented 2018-08-15 20:46:02 +02:00 (Migrated from github.com)

I have not run the formatting. I'm having to undo 20 years of habit to remember to format before checking in. I still want to make that a report rather than a requirement.

It sounds like we would additionally like a report from Travis (or the local build) that states that the generated API doc needs to be updated. This involves starting the server. Have we done that on Travis before? Do we run any integration tests on Travis?
lbrynaut commented 2018-08-16 18:43:56 +02:00 (Migrated from github.com)

> Do we run any integration tests on Travis?

No. Possibly related to #164
Reference: LBRYCommunity/lbrycrd#187