
Single stream processing #49

Draft
klendathu2k wants to merge 82 commits into main from single-stream-rebase

Conversation

@klendathu2k
Owner

Work in progress. Submits streaming detectors by host.

Truncated commit summaries:

- … and dataset rows in the filecatalog / datasets table.
- …e of the table doesn't change out from under us.
- …tput file minus the .root extension) to the production status entry.
- …ction setup. We will replace the $(streamname) token with _X_ to shorten it and eliminate characters that might otherwise need to be escaped.
@klendathu2k
Owner Author

This PR ought to merge in the changes necessary to support single stream processing in the streaming event builder.

@klendathu2k klendathu2k marked this pull request as ready for review January 2, 2025 02:23
Comment thread cups.py



def update_production_status( update_query, retries=10, delay=10.0 ):
Owner Author


This looks like a zombie method... removed in previous PRs, but this PR is bringing
it back from the dead.
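The signature `update_production_status(update_query, retries=10, delay=10.0)` suggests a retry-with-delay wrapper around a status-table update. A minimal sketch of that pattern, with a hypothetical `execute` callable standing in for the real database handle (which isn't shown in the diff):

```python
import time

def update_production_status(update_query, retries=10, delay=10.0, execute=None):
    """Retry a status-table update up to `retries` times, sleeping `delay`
    seconds between attempts.  `execute` is a hypothetical callable that
    runs the query and raises on transient failure (e.g. a DB lock)."""
    last_error = None
    for _ in range(retries):
        try:
            return execute(update_query)
        except Exception as err:
            last_error = err
            time.sleep(delay)
    raise RuntimeError("update failed after %d attempts" % retries) from last_error
```

If the method really is a zombie, the sketch above is only useful as a record of what it did before deleting it again.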

Comment thread slurp.py
#pprint.pprint(values)
#exit(0)

statusdbw.execute(insert)
Owner Author


TODO: switch to dbQuery.
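The project's `dbQuery` helper isn't shown in this diff, so its real signature is unknown; a minimal stand-in illustrating the intended shape (a parameterized execute on a shared cursor, replacing bare `statusdbw.execute(insert)` calls) might look like:

```python
def dbQuery(cursor, query, params=()):
    """Hypothetical stand-in for the project's dbQuery helper: run a
    parameterized query on the given DB-API cursor and return the cursor
    so results can be iterated or the rowcount inspected."""
    cursor.execute(query, params)
    return cursor
```

Centralizing execution like this is what makes the later TODOs ("switch to dbQuery") a mechanical change.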

Comment thread slurp.py

statusdbw.execute( insert )
statusdbw.commit()
try:
Owner Author


TODO: switch to dbQuery.

Comment thread slurp.py

# Build dictionary of DSTs existing in the datasets table of the file catalog. For every DST that is in this list,
# we know that we do not have to produce it if it appears w/in the outputs list.
dsttype="%s_%s_%s"%(name,build,tag) # dsttype aka name above
Owner Author


TODO: May want to use 'name_' in the dsttype... I believe we want streamname substituted here.
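Per the commit notes, the $(streamname) token gets rewritten at submission time; for catalog lookups the concrete stream name would need to be substituted into the dsttype built from name, build, and tag. A hedged sketch (the template, build, and stream names below are made-up examples, and the function name is hypothetical):

```python
def make_dsttype(name, build, tag, streamname=None):
    """Build the dsttype string used for file-catalog lookups, optionally
    substituting a concrete stream name for the $(streamname) token."""
    dsttype = "%s_%s_%s" % (name, build, tag)
    if streamname is not None:
        dsttype = dsttype.replace("$(streamname)", streamname)
    return dsttype
```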

Comment thread slurp.py

for output_, tuple_ in dstnames.items():
dt, ds = tuple_
exists = { c.filename : ( c.runnumber, c.segment ) for c in fccro.execute( f"select filename, runnumber, segment from datasets where runnumber>={runMin} and runnumber<={runMax} and dsttype='{dt}' and dataset='{ds}'" ) }
Owner Author


This is very pythonic... but we should be using the standard dbQuery function for the database access.
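Beyond routing through `dbQuery`, interpolating `runMin`, `runMax`, `dt`, and `ds` with an f-string is injection-prone; any DB-API call can bind parameters instead. A sketch of the same dict comprehension with bound parameters (`?` is the sqlite placeholder style; the actual driver may use `%s`, and the function name here is made up):

```python
def select_existing_dsts(cursor, run_min, run_max, dsttype, dataset):
    """Return {filename: (runnumber, segment)} for DSTs already present in
    the datasets table, using bound parameters rather than f-string
    interpolation into the SQL text."""
    query = ("select filename, runnumber, segment from datasets "
             "where runnumber>=? and runnumber<=? and dsttype=? and dataset=?")
    return {fn: (run, seg)
            for fn, run, seg in cursor.execute(query, (run_min, run_max, dsttype, dataset))}
```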

Comment thread slurp.py
on lfnlist.filename=files.lfn;
"""
#print(fcquery)
lfn2pfn = { r.lfn : r.pfn for r in fccro.execute( fcquery ) }
Owner Author


Again with the dbQuery comment.

Comment thread slurp.py
# and (3) the hash of the local github repository where the payload scripts/macros are found.
#
repo_dir = payload #'/'.join(payload.split('/')[1:])
repo_dir = payload
Owner Author


May need to forcibly strip whitespace and/or a trailing '/' from the path. TBD.
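The stripping suggested here is a one-liner with standard string methods; a sketch (the function name is hypothetical):

```python
def normalize_repo_dir(payload):
    """Strip surrounding whitespace and any trailing '/' from the payload
    path before using it as the repository directory."""
    return payload.strip().rstrip("/")
```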

Owner Author

@klendathu2k klendathu2k left a comment


Should be ready to merge and test.

@klendathu2k klendathu2k marked this pull request as draft January 2, 2025 15:29
@klendathu2k
Owner Author

Note... this may have been superseded by PR #64...

