
SSim_DC1's Introduction

Configuration, production, validation and tools for the DC1 Data Set.

The repository holds all of the issues, tools and validations used for the DC1 production. It also holds LaTeX files which document the DC1 production strategy and techniques. There is a GitHub Issues list to track the work necessary to complete the production. See the Issues and Milestones for more information.

A high level summary of the production plan follows:

  • We will produce PhoSim and imSim e-image files.
  • We will also produce flats in order to correct for vignetting in the PhoSim files.
  • The current CI workflow needs to be tweaked to work at NERSC. We need to understand how to make both the workflow and DM work on full focal planes and more generally how to run in the time restricted batch environment there.
  • CI and SSim will specify the cookbooks (based on Twinkles experience). CI will run the jobs.
  • The DM output catalogs will be placed in a qserv database if possible.

Last edited by C. Walter (01/26/17)


SSim_DC1's Issues

Produce new truth catalogs

I believe the current star and galaxy truth catalogs were produced by @jchiang87 by processing the instance catalogs.

Perhaps here?
https://github.com/LSSTDESC/pserv/tree/u/jchiang/add_imsim_truth/sql

One consequence of this is that the disk and bulge components of galaxies are entered separately into the truth tables (as this is how they are simulated). We should probably generate the truth tables directly from the CatSim database. We should also consider adding information not in the instance catalogs, such as the stellar population id.

Anything else we should include?

Write down details about dithering strategy/dithering effort

In this folder (https://github.com/LSSTDESC/SSim_DC1/tree/master/DC1_DESC_Note) I am writing down details about DC1. In principle, this is intended to be a DESC note and serve as documentation of what we've done. Potentially it could grow into a paper if we think there is enough material (in my opinion there is, but we'll discuss it). This is a separate document from the one that describes the LSS analysis (which will just give a brief description of the products and refer to whatever we create here).

The details about the dithering strategy and implementation are missing.

I added a pseudo-outline about what's missing. @humnaawan, @egawiser could you please add something about the dithering strategy?

Please, feel free to add anything that you think is missing or that needs improvement.

Add people to DC1 Roadmap team for communication purposes

Dear @DarkEnergyScienceCollaboration/dc1-roadmap-team,

Another day to "channel my inner Phil". We have our 1st milestone for DC1 production coming up at the end of the month. We need a way to communicate, since we don't have a mailing list for this, so I have followed Phil's suggestion and added everyone who has participated in an issue or added code to this repo to the dc1-roadmap-team.

I was very loose in who I added here. If you have interacted at all with this repo I added you :)

But never fear: you can easily remove yourself.

Go here:
https://github.com/orgs/DarkEnergyScienceCollaboration/teams/collaboration-council

and click the "leave" button and you won't be bothered by future emails.

If you are going to be involved with DC1 production it is best if you stay. If you have questions or need help, go ahead and post here. Also, if there is anyone I missed who should be included (look at the URL above), please let me know.

Characterize the issues with the PhoSim background.

At the SUNY meeting we discussed how feasible it would be to either fix the background level for at least some of the exposures by adding a flat background on top of the ray-traced sky sources, or possibly to consider whether the field could be used as a kind of DDF test. @egawiser also made some relevant comments on correlated noise (Eric, could you mention that here?).

We decided that the first thing to do would be to characterize how bad the situation was. This means we need to understand:

  • How much the depth varies over the entire target region (which is about four times the targeted area due to the rotation issue detailed in #36).

  • Since the amount of background suppression depends on the local PhoSim "altitude" line, characterize how much variation there was in the background level.

@fjaviersanchez Did you say you were working on these?

Validate single chip output

Look at one chip with 5 to 10 visits to validate the output after processing through DM. We should do this for a central and edge chip to see the effect of vignetting.

Update scripts in this repo

The notebooks in the repo are outdated. It would be good to update them to point to the correct paths and to remove unnecessary content (for example, the flag unpacking, which should now be deprecated).

Make milestones

For example, what do we want done by the Oxford meeting, and what's a good intermediate step before that?

Specify LSS columns

A recap of email thread: @egawiser @cwwalter @jchiang87 @connolly @richardxdubois @ixkael

I'm moving this to a ticket, so that it is public and remains here for posterity.

Chris Walter:

During your meeting today could you discuss the following:

We have been meeting (SSim + CI) to get the DC1 production going. We need to know what columns you would like available in the (hopefully Qserv) database that will be used to store the results of the DM running.

James Chiang:

The baseline schema for the Qserv tables is here:

https://lsst-web.ncsa.illinois.edu/schema/index.php?sVer=baseline

Perhaps have a look at the Object table,

https://lsst-web.ncsa.illinois.edu/schema/index.php?sVer=baseline&t=Object

We should add a DC1 landing page

We should add a page on Confluence that points to the locations of the DC1 dataset outputs and to our resources and information on how to read and use the data. I think there is no single place to point people to now.

Make sure simulated catalogs cover a large enough area

DC1 plans to simulate coverage of 4 contiguous LSST fields (hexagons) on the sky, with dithers allowing the central pointing to fall anywhere within one of those hexagons. A quick geometrical simulation is needed to determine the total footprint size and aspect ratio; in the SRM and related discussions, this was estimated to be 90 square degrees, but it's not a square. If this has not been done yet, it would be helpful to determine when it is needed by, what the most useful format is (do we need to specify which 4 fields in ra,dec?), and who is willing to undertake this.
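A minimal sketch of one way to do that quick geometric estimate, using shapely to buffer a union of hexagons by the field-of-view radius; the FOV radius, hexagon size, and layout below are illustrative assumptions, not the actual DC1 field choices:

import numpy as np
from shapely.geometry import Polygon
from shapely.ops import unary_union

# Assumed geometry (flat-sky approximation): 4 contiguous OpSim field hexagons,
# with each pointing allowed to dither anywhere inside its hexagon, so the
# imaged footprint is the hexagon union buffered by the FOV radius.
fov_radius = 1.75   # deg, nominal LSST field-of-view radius (assumption)
hex_r = 1.75        # deg, hexagon circumradius (assumption)

def hexagon(cx, cy, r):
    ang = np.deg2rad(np.arange(0, 360, 60))
    return Polygon(list(zip(cx + r * np.cos(ang), cy + r * np.sin(ang))))

dx, dy = 1.5 * hex_r, np.sqrt(3) / 2 * hex_r
centers = [(0, 0), (dx, dy), (dx, -dy), (2 * dx, 0)]   # 4 mutually adjacent fields
hexes = unary_union([hexagon(x, y, hex_r) for x, y in centers])

footprint = hexes.buffer(fov_radius)
xmin, ymin, xmax, ymax = footprint.bounds
print("approx. area: %.0f sq deg; bounding box %.1f x %.1f deg"
      % (footprint.area, xmax - xmin, ymax - ymin))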

Resolve 20 mas astrometry bias in the DC1 datasets

There is some discussion of this issue in #39, but I wanted to put it into its own issue and understand where we are on this and what our plan going forward is.

Using the dithered imSim catalogs, which have been matched with the reference catalog, we see a 20 mas shift between the measured and true positions, for both stars and galaxies.

[attached plots: measured minus true positions for stars (two panels) and for galaxies]

More detail on how these plots were made is here:

https://github.com/cwwalter/DC1_Analysis/blob/master/Notebooks/Star-Galaxy.ipynb

@fjaviersanchez has shown that similar results are seen in the PhoSim files.

Our hypothesis (from @ctslater) is that although we accounted for proper motion in our catalogs, the DM stack currently does not. See:

https://jira.lsstcorp.org/browse/DM-8828

@danielsf made some plots of the proper motions of the MW stars in CatSim (perhaps he can repost them below), but it still isn't exactly clear why we see what we do.

Another question is what model is used for the MW stars. Apparently their motion comes from a model from @mjuric. If he could add a reference that would be good.

If this is in fact the problem, the solution is likely to wait for reprocessing with a version of the stack that properly includes proper motion. But we should actually confirm that we understand this and that there are no extra issues.

The consensus seems to be that if we do understand it, it is not worth trying to "correct" for the effect before reprocessing. Please do comment if this is not the case.

Recommend max star brightness in DC1 runs

From the DC1 planning meeting (2016-09-22), the idea was to look at removing stars brighter than 10th (TBD) magnitude.

Email trail from Seth:

Not sure whether this belongs here, but I've started a Confluence page looking at the influence of bright stars on the CPU times for the Twinkles runs, based on Tom's current PhoSim-deep-pre3 runs, and phosim trimcat (i.e., input source list) files that Jim generated for the Opsim visit Tom is running (1668469). The giant instance catalog for the FOV gets divided up into 189 trimcat files, one per CCD. Evaluating total CPU times is tricky for these jobs with checkpointing (Tom can explain). So far I've looked only at the stellar content of the phosim jobs that have been running for weeks. Formerly such long times were not possible on the batch farm, which has a 120 CPU-hour limit, but the checkpointing has effectively enabled it.

The bottom line so far is that the brightest few stars (two have magnitudes <7 in the instance catalog) have a huge influence on the CPU time requirements, not just at their locations but over a fairly large area.

Once we've worked out CPU time tallies, we can consider what magnitude limits might need to be imposed in the instance catalogs for the next Twinkles run.

Run L2 pipeline on a subset of DC1 data using v14_0 (or later)

It would be useful to run the L2 pipeline on some of the DC1 imsim-dithered data using a more modern version of the Stack, e.g., v14_0. This will enable us to

  • use the new QA tools to see the differences between the old and new L2 outputs,
  • prepare for running the L2 pipeline on the protoDC2 and DC2 data.

Heads up for DC2

As the analysis for DC1 begins and efforts for DC2 ramp up, I wanted to report a few results. We tried finding the DC1 chip list using a different method as a cross-check; thanks to @danielsf. Comparing the numbers, we find that the other method produces 8434 more chips overall (~4.5% of the simulated DC1 science sensors) but misses some that are produced in the current list. Looking at the exact positions, the chips that are missing in one dataset and present in the other either lie at the boundary of the DC1 region or are (almost) outside the nominal circular FOV, e.g., see this short simulation for some examples:

fid1447_both1024_subset

It isn't clear if there's a quick solution to the problem: the high HEALPix resolution used to find the chips causes some pixels near the boundaries (of the FOV or the DC1 region) to be deemed outside the visit, hence missing some chips (that are on the relevant raft and/or in the region). While the impact of these missing chips should be negligible for DC1**, we might need to think about a cleverer way to find all the chips that need to be simulated for DC2. Of course, a metric to estimate the expected number of chips (as was suggested in one of the SSim telecons) would help check whether the chip-finding technique is effective.

**If the missing chips were uniformly distributed within the FOV region (which they are not, and hence are not that critical), the 4.5% missing sensors would translate to a loss of about 0.02 mag in differential depth.
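For reference, the back-of-the-envelope scaling behind that number, under the assumption (mine) that the coadds are sky-limited so the 5-sigma depth improves as 1.25 log10(N) with the number of contributing sensor-visits:

import numpy as np

# 4.5% of sensor-visits missing (from the text above); sky-limited coadd depth
# assumed to scale as 1.25*log10(N_visits).
missing_fraction = 0.045
delta_mag = 1.25 * np.log10(1.0 / (1.0 - missing_fraction))
print("depth loss if spread uniformly: %.3f mag" % delta_mag)   # ~0.025, i.e. of order 0.02 mag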

DC1 dataproduct retention

DC1 PhoSim generated 5 TB of simulated images
DC1 DM pipeline read those images and generated 75 TB of output data

This is a big problem.

The PhoSim data were stored in DESC "project space", which now contains 65 TB of an 81 TB quota. The DM pipeline data were stored in global scratch space. Technically, scratch space is limited to 20 TB per user so we are way over quota (quotas do not seem to be enforced at present). In addition, scratch directories are purged after 12 weeks -- unless the data is accessed, which resets the clock. Purging has not been done up until now, but NERSC has recently warned it was about to start.

Here is a look at the DM pipeline directory, along with a summary of space consumed:

tony_j@cori17:/global/cscratch1/sd/descdm/DC1/DC1-imsim-dithered> du -hs *

Space used    Directory
4.0K          _mapper
28T           calexp
620K          config
21T           deepCoadd
188G          deepCoadd-results
4.9M          deep_assembleCoadd_metadata
125M          deep_makeCoaddTempExp_metadata
838M          eimage
28T           icExp
195G          icSrc
12G           processEimage_metadata
243M          ref_cats
17M           registry.sqlite
408K          schema
2.0T          src
2.6G          srcMatch

What to do with these data? One suggestion was to move them to HPSS (tape), but doing so does not look all that attractive. Such a move would:

  1. require some detailed analysis of how to package the data to simultaneously satisfy the requirements of HPSS (bundle files into tar-balls of 10s to 100s of GB), and to make sensible restoration packages from the DM stack/analysis perspective
  2. require writing a script to implement the above
  3. require users to learn how to restore portions of these data when needed
  4. more than consume our SRU (Storage Resource Unit) allocation for the year

Another suggestion is to prune the extraneous/unneeded/unwanted files. Can we agree on what to prune and how to do it?
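For scale, a quick tally from the du listing above; treating calexp and icExp as the prime pruning candidates is only a suggestion here, since they are per-visit image products that could in principle be regenerated by re-running the pipeline on retained inputs:

# Sizes in TB, read off the du -hs listing above.
sizes_tb = {"calexp": 28, "icExp": 28, "deepCoadd": 21, "src": 2.0}
prunable = sizes_tb["calexp"] + sizes_tb["icExp"]
print("pruning calexp + icExp would free ~%d TB of the ~75 TB total" % prunable)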

Investigate adding cores to Edison/Cori shared partition

At 2500 cores, Tom estimates 3/4 of a year to run DC1. Gulp. The least-work mode for DESC currently is to use the shared partition. To be useful we would need several times more than 2500 cores. @djbard was asked to investigate upwards of 10k cores in the shared partition.

Help with Notebook Tutorials

At the Argonne sprint week it was suggested that some of the notebook tutorials could be updated and that the documentation of the content of notebooks could be made clearer. I'm going to try to tackle this with the hope that in the process I'll learn something about DC1.

Modify DM workflow for DC1 production

The current CI workflow needs to be tweaked to work at NERSC. We need to understand how to make both the workflow and DM work on full focal planes and more generally how to run in the time restricted batch environment there.

DC1 phoSim production

This issue is intended to be a continuous log of the DC1 phoSim production at NERSC.

To start things off, a summary update of this project was given on Monday (12 Dec 2016) in the DESC-CI meeting (https://confluence.slac.stanford.edu/x/SryMCw). The initial workflow is being developed to include the following features:

  • phoSim v3.6.0 (released 2 Dec 2016)
  • cluster_submit technology (splits instance catalog trimming from raytracing, and parallelizes at chip level)
  • multi-threaded operation (with a modest number of threads, probably 4-16)
  • operation on NERSC's cori-haswell using a new 'pilot job' feature of the SLAC workflow engine
  • inclusion of dmtcp checkpointing to allow for very long-running simulated visits

We would like to get the first test runs going in the next week or so, but possibly not until after the holidays.

Stay tuned!

Update for processEimage.py configs in Stack w.2017.1 (and later?)

In the Stack weekly build w.2017.1, processEimage.py has some problems with the eimage files being produced for DC1 using imSim. The relevant discussion started on Slack here.

The upshot is that @SimonKrughoff recommends the following config change:

calibrate.astrometry.matcher.numBrightStars=100

This version of the Stack has not been tested on PhoSim data yet.
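A sketch of how that override might be applied, assuming processEimage.py behaves like a standard CmdLineTask that accepts a --configfile override (the file name here is arbitrary):

# processEimage_overrides.py -- hypothetical pex_config override file; the only
# setting changed is the one recommended above.
config.calibrate.astrometry.matcher.numBrightStars = 100

# Applied with something like (assuming the usual CmdLineTask command line):
#   processEimage.py <repo> --id visit=<visit> --configfile processEimage_overrides.py --output <rerun>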

DC1 phoSim configuration

For the record, below is the current command/override file being used in the DC1 tests (and as used in Twinkles). It was also presented at last Monday's CI meeting.

cleardefects
clearclouds
airglowvariation 0
fringing 0
contaminationmode 0

I have heard Chris say that using the same set of settings as Twinkles is a good thing. But John (and others?) have suggested turning on various elements, at least partly because we are now using phoSim v3.6.0. What shall it be?
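For the record, a hedged sketch of how such a command file is used; the file name is arbitrary, and -c is the standard phosim physics-command-file option:

# Save the override commands above into e.g. dc1_commands.txt, then (hypothetical direct invocation):
#   phosim.py instance_catalog.txt -c dc1_commands.txt
# The production workflow described above uses cluster_submit rather than invoking phosim.py directly.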

Set maximum star magnitudes and record positions in DC1 pipeline.

Based on the discussion in issue #19 we have decided to set a maximum magnitude of 10, replacing brighter stars with this magnitude and recording the objects (and their positions) for which this change was made. The LSS group will use this information (along with a few fully simulated bright stars) to make appropriate masks.

This issue is to record the request and make sure the feature is implemented.
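A minimal sketch of what that step could look like on an instance catalog, assuming the usual 'object <id> <ra> <dec> <magNorm> <sed> ...' line format; this is an illustration, not the production implementation:

MAX_MAG = 10.0

def clip_bright_objects(infile, outfile, logfile):
    # Replace magNorm < 10 with 10 on phoSim/imSim 'object' lines and log the
    # id, ra, dec and original magnitude of every object that was changed.
    with open(infile) as fin, open(outfile, 'w') as fout, open(logfile, 'w') as flog:
        for line in fin:
            fields = line.split()
            if fields and fields[0] == 'object' and float(fields[4]) < MAX_MAG:
                flog.write(' '.join(fields[1:5]) + '\n')
                fields[4] = '%.4f' % MAX_MAG
                line = ' '.join(fields) + '\n'
            fout.write(line)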

DC1 Chips To Simulate

@cwwalter, @danielsf There appears to be an important but subtle bug in the chip-finding algorithm, such that there should be more chips to simulate than the ones in the current pickle output. The symptom that led @egawiser and me to discover it is the bimodality in the chip count for the dithered survey:

[figure: bimodal histogram of chip counts per visit for the dithered survey]

I checked to make sure that the dithering was implemented correctly: the dithered pointings do move enough that more than half of the neighboring chips should be coming into the proposed DC1 region — something that the gap in the bimodal histogram does not reflect.

After quite a bit of debugging (many thanks to @danielsf for trying an independent method to confirm our concerns), it appears that the slicer interface is not working properly (specifically the routine slicer.getSimData, which is supposed to give all the instances of observations in a given data slice, i.e., a HEALPix pixel). I don't fully understand why the problem is arising.

While I further investigate this apparent source of the problem, I am running a workaround that bypasses that slicer-dependent interface with simdata. We don't think the current chips-to-simulate list has extraneous chips but is simply missing some. We suspect an increase of 30-50% in the chip count and should know the exact number once the new run finishes in ~24 hours.

It is unclear to us what the best way forward is in terms of how/when to run the additional chips through PhoSim. If the order of PhoSim runs is unimportant, we could let the current set finish while investigating this further and then submit the additional jobs for chips in the new pickle list that are not in the initial one. We apologize for not identifying and fixing this problem sooner!
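When the new run finishes, the incremental submission list could simply be the set difference of the two pickled chip lists; a minimal sketch (the file names and the assumed per-visit dictionary structure are mine):

import pickle

with open('chips_current.pkl', 'rb') as f:
    current = pickle.load(f)          # assumed {obsHistID: [chip name, ...]}
with open('chips_new.pkl', 'rb') as f:
    new = pickle.load(f)

# Chips in the new list that are missing from the current one, per visit.
extra = {visit: sorted(set(chips) - set(current.get(visit, [])))
         for visit, chips in new.items()}
extra = {visit: chips for visit, chips in extra.items() if chips}
print(sum(len(c) for c in extra.values()), "additional chips to simulate")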

Track items for which reprocessing would be helpful

As we find issues and missing items in the current DC1 processing, we might want to do a DM reprocess at some point (probably after the next major DM release). This issue is to track items (or upgrades) we know about so we can make an informed decision at that time. Add new items to this list as they come up.

  • The stack now has a new/different astrometry solver.
  • We didn't include the algorithm in processing that calculates "blendedness" (See DM-10506).
  • Proper motion is not yet supported by DM in reference catalogs (See DM-8828). We included it in our catalogs.

Ingest DC1 coadd catalogs into a MySQL database

I've done this for the imsim undithered data, the Level 2 output of which is available at NERSC:

/global/cscratch1/sd/descdm/DC1/full_focalplane_undithered

For now, I've put the data in the DESC_Twinkles_Level_2 database on scidb1.nersc.gov. NERSC has provided a DESC_DC1_Level_2 database on nerscdb04.nersc.gov, but that db isn't configured to accept loading via csv files (which is faster by O(10) than insert commands), so I've used the Twinkles db until the DC1 db is reconfigured.

The schema for the Coadd_Object table is derived from the FITS tables in the coadd merged object catalogs, e.g., /global/cscratch1/sd/descdm/DC1/full_focalplane_undithered/deepCoadd-results/merged/0/10,10/ref-0-10,10.fits. I've added columns (as part of the primary key along with the id column) to contain the patch and projectId, respectively. The patch can serve as part of a rough spatial query, in addition to identifying the coadd image where an object can be found. projectId will be used to differentiate the Level 2 catalog results between the imsim undithered, imsim dithered, and phosim DC1 datasets.

Here is example code showing how to obtain query results as a pandas DataFrame:

import desc.pserv

#db_info = dict(host='nerscdb04.nersc.gov',                                      
#               database='DESC_DC1_Level_2')                                     
db_info = dict(host='scidb1.nersc.gov',
               database='DESC_Twinkles_Level_2')

connection = desc.pserv.DbConnection(**db_info)

query = '''select * from Coadd_Object co join Project pr on                      
           co.projectId=pr.projectId where co.patch="'10,10'" and                
           pr.projectName="DC1 imsim undithered" limit 10'''

#query = '''select * from Coadd_Object where patch="'10,10'" and projectId=0'''  

df = connection.get_pandas_data_frame(query)

The latter, commented-out query can be used to avoid the join with the Project table if you know the projectId. Instructions for setting up and using the pserv package are available at the pserv repo.

Ingest simulation truth info on DC1 objects in db tables

The object (~instance) catalogs that were used by the imsim simulation runs contain the input truth information for each simulated object, i.e., ICRS coordinates, SEDs, and Galactic extinction for stars, plus redshifts, sersic2d parameters and intrinsic reddening for galaxies. I plan to use sims_photUtils to compute ugrizy apparent magnitudes and will put all this info into tables in the DESC_DC1_Level_2 db.
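A rough sketch of that magnitude calculation with sims_photUtils; the method names below are what I believe the CatSim photUtils API provides, and intrinsic/Galactic reddening are omitted for brevity even though the real code will apply them:

import os
from lsst.sims.photUtils import Sed, Bandpass, BandpassDict

lsst_bands = BandpassDict.loadTotalBandpassesFromFiles()   # default ugrizy throughputs

imsim_bp = Bandpass()
imsim_bp.imsimBandpass()   # flat "normalizing" bandpass that magNorm refers to

def apparent_mags(sed_file, mag_norm, redshift=0.0):
    # Dust (intrinsic + Galactic) is omitted in this sketch.
    sed = Sed()
    sed.readSED_flambda(os.path.join(os.environ['SIMS_SED_LIBRARY_DIR'], sed_file))
    sed.multiplyFluxNorm(sed.calcFluxNorm(mag_norm, imsim_bp))
    if redshift > 0:
        sed.redshiftSED(redshift, dimming=True)
    return lsst_bands.magListForSed(sed)   # [u, g, r, i, z, y]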

Draft schema for separate star and galaxy tables are in the pserv repo:

https://github.com/LSSTDESC/pserv/blob/u/jchiang/add_imsim_truth/sql/create_StarTruth.sql
https://github.com/LSSTDESC/pserv/blob/u/jchiang/add_imsim_truth/sql/create_GalaxyTruth.sql

and the code to compute the magnitudes is at this imSim PR:

https://github.com/LSSTDESC/imSim/pull/61/files

Comments/suggestions welcome.

Finish the 1st DC1 Milestone.

Dear All,

We are one week away from our 1st DC1 milestone:

"DC1 Production Plan"

'Finalize the DC1 Production Plan including answering all of the HW questions and tasks created at the Oxford Meeting.'

https://github.com/DarkEnergyScienceCollaboration/SSim_DC1_Roadmap/milestone/1

There are three outstanding issues and I have commented this morning on all of them so we can try to bring them to a close in the next few days.

FYI: our next milestone which will be for September and will be to 'Make and validate DC1 Test Samples'. People can start to look and fill out details if they want here:

https://github.com/DarkEnergyScienceCollaboration/SSim_DC1_Roadmap/milestone/2

Differing coordinate system conventions in CatSim and PhoSim

DC1 data generation has recently been hampered by an apparent difference in convention between how CatSim converts RA, Dec into coordinates on the LSST focal plane and how PhoSim makes the same conversion. Specifically, to maximize computational efficiency, we used CatSim to find a list of individual chips that were in the DC1 region on the sky after a series of random rotational dithers had been applied to the minion_1016 OpSim observing cadence. Because PhoSim and CatSim disagree on what a camera rotation of +90 degrees means, chips which CatSim declared to be inside the DC1 region were simulated by PhoSim outside of the DC1 region. Below, I will try to quantify the differences between how CatSim converts from RA, Dec to focal plane position and how PhoSim converts from RA, Dec to focal plane position. The two conversions adopt different sign conventions both in terms of the camera rotator angle and in how displacement to the East on the sky corresponds to displacement on the focal plane. I have not found a definitive document allowing me to declare which convention is "correct."

It may be useful to compare this discussion to the discussion here, where the handedness of the PhoSim WCS is shown to be opposite that of the WCS used by ImSim.

LSSTDESC/imSim#39

Some references

A confluence page describing the LSST focal plane

https://confluence.lsstcorp.org/display/LSWUG/Representation+of+a+Camera

The relevant diagram is here:

[image: Confluence diagram of the LSST focal plane raft layout]

Official camera team diagrams of the focal plane can be found in DocuShare under LCA-13381. Here is a screenshot of one of the more salient diagrams

[image: camera team focal plane diagram from LCA-13381]

I believe these diagrams are consistent with each other. Note that in both, R:3,4 is in the +x direction from R:1,4, and R:1,4 is in the +y direction from R:1,0.

A test

I generated a series of 12 InstanceCatalogs, each containing only one source. In the first 4 catalogs, I set rotSkyPos=0 and used CatSim's methods to place the object at the center of a specific raft. I then simulated the image corresponding to each catalog with PhoSim. Below, I present the chip that CatSim thought the object was on, the chip that PhoSim simulated the object on, the focal plane coordinates reported by CatSim, and the RA, Dec of the object. In all cases, the telescope is pointed at RA=33.93, Dec=-6.7

CatSim chip | PhoSim chip | Focal plane x (CatSim; mm) | Focal plane y (CatSim; mm) | RA | Dec
R:1,4 S:1,1 | R:3,4 S:1,1 | -127 | 254 | 34.64 | -5.30
R:3,4 S:1,1 | R:1,4 S:1,1 | 127 | 254 | 33.22 | -5.30
R:1,0 S:1,1 | R:3,0 S:1,1 | -127 | -254 | 34.64 | -8.12
R:3,0 S:1,1 | R:1,0 S:1,1 | 127 | -254 | 33.22 | -8.12

Things to note:

  • The chips and focal plane coordinates reported by CatSim appear to be self-consistent (though this may be a tautology, as the CatSim methods that calculate focal plane coordinates and the CatSim methods that find chips were written, validated, and tested together)
  • PhoSim appears to be reflected about the y-axis with respect to CatSim (though, again, that assumes that you trust the focal plane coordinates reported by CatSim). This corresponds to a disagreement over how displacement to the East on the sky corresponds to displacement in the x direction on the focal plane.

For the other 8 catalogs, I used the same 4 RA, Dec positions for the objects, but changed rotSkyPos to see how the objects moved across the focal plane. Below I report the value of rotSkyPos, the original rotSkyPos==0 CatSim chip, the rotSkyPos!=0 CatSim chip, the original rotSkyPos==0 PhoSim chip, the rotSkyPos!=0 PhoSim chip, and the focal plane coordinates reported by CatSim with rotSkyPos!=0 (again, in mm)

rotSkyPos | CatSim rot==0 | CatSim rot!=0 | PhoSim rot==0 | PhoSim rot!=0 | Focal x | Focal y
45.0 | R:1,4 S:1,1 | R:0,3 S:1,1 | R:3,4 S:1,1 | R:4,3 S:1,0 | -269 | 90
45.0 | R:3,4 S:1,1 | R:1,4 S:2,1 | R:1,4 S:1,1 | R:3,4 S:0,1 | -90 | 270
45.0 | R:1,0 S:1,1 | R:3,0 S:0,1 | R:3,0 S:1,1 | R:1,0 S:2,1 | 90 | -270
45.0 | R:3,0 S:1,1 | R:4,1 S:1,2 | R:1,0 S:1,1 | R:0,1 S:1,2 | 270 | -90
90.0 | R:1,4 S:1,1 | R:0,1 S:1,1 | R:3,4 S:1,1 | R:4,1 S:1,1 | -254 | -127
90.0 | R:3,4 S:1,1 | R:0,3 S:1,1 | R:1,4 S:1,1 | R:4,3 S:1,1 | -254 | 127
90.0 | R:1,0 S:1,1 | R:4,1 S:1,1 | R:3,0 S:1,1 | R:0,1 S:1,1 | 254 | -127
90.0 | R:3,0 S:1,1 | R:4,3 S:1,1 | R:1,0 S:1,1 | R:0,3 S:1,1 | 254 | 127

Comparing to this image from the Confluence page referenced above

[image: the same Confluence focal plane diagram as above]

we see the following

  • As rotSkyPos increases, the objects in CatSim rotate clockwise in the focal plane diagram
  • As rotSkyPos increases, the objects as simulated in PhoSim rotate counter-clockwise in the focal plane diagram

The CatSim documentation for the meaning of rotSkyPos can be found here

https://github.com/lsst/sims_utils/blob/master/python/lsst/sims/utils/ObservationMetaData.py#L74

It can be summarized

rotSkyPos | North in focal plane | East in focal plane
0.0 | +y | -x
90.0 | -x | -y
180.0 | -y | +x
-90.0 | +x | +y

This is consistent with objects rotating clockwise in the focal plane as rotSkyPos increases.
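As a quick numeric cross-check of the CatSim rows in the second table, rotating the rotSkyPos==0 focal plane coordinates by rotSkyPos with a standard rotation matrix reproduces the rotSkyPos!=0 coordinates; whether that rotation looks clockwise or counter-clockwise then depends only on how the diagram's axes are drawn:

import numpy as np

def rotate(x, y, rot_deg):
    # Standard rotation in the +x-right, +y-up sense.
    t = np.deg2rad(rot_deg)
    return x * np.cos(t) - y * np.sin(t), x * np.sin(t) + y * np.cos(t)

# rotSkyPos==0 CatSim focal plane coordinates (mm) from the first table.
rot0 = {'R:1,4 S:1,1': (-127, 254), 'R:3,4 S:1,1': (127, 254),
        'R:1,0 S:1,1': (-127, -254), 'R:3,0 S:1,1': (127, -254)}

for rot in (45.0, 90.0):
    for chip, (x, y) in rot0.items():
        print(rot, chip, '-> (%.0f, %.0f)' % rotate(x, y, rot))
# e.g. 90.0 R:1,4 S:1,1 -> (-254, -127), matching the table above.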

create dask dataframes for DC1 data with analysis flags

This should be done for the Level 2 results in

/global/cscratch1/sd/descdm/DC1/rerun/DC1-imsim-dithered/
/global/cscratch1/sd/descdm/DC1/rerun/DC1-imsim-undithered/
/global/cscratch1/sd/descdm/DC1/DC1-phoSim-3a/

at NERSC.

Here are pointers from Chris (via #desc-ci) on the dask dataframe generation script:

The simple script is sql-query.py in /global/u1/c/cwalter/SSim_DC1/Notebooks/Dask (it's not actually in the repo). It's simple; I was just commenting and uncommenting the various lines for the 3 cases.
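Since sql-query.py is not in the repo, here is a minimal sketch of one way to build such a dask dataframe from the per-patch merged coadd catalogs; the glob pattern and the use of astropy to read the FITS tables are my assumptions, not the contents of the actual script:

import glob
from astropy.table import Table
from dask import delayed
import dask.dataframe as dd

rerun = '/global/cscratch1/sd/descdm/DC1/rerun/DC1-imsim-dithered'
paths = sorted(glob.glob(rerun + '/deepCoadd-results/merged/*/*/ref-*.fits'))

def read_patch(path):
    # Multi-dimensional columns don't convert to pandas; keep scalar columns only.
    tab = Table.read(path)
    keep = [name for name in tab.colnames if len(tab[name].shape) == 1]
    return tab[keep].to_pandas()

df = dd.from_delayed([delayed(read_patch)(p) for p in paths])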

PhoSim indexing fails with very large numbers of input objects

This is a re-post from @rbiswas4 at LSSTDESC/Twinkles#383:

PhoSim indexes astrophysical sources with floats. For the very large integers used as instance catalog indices, this causes indexing failures due to numerical rounding. This was found in the first set of Twinkles runs. A ticket describing the problem - PHOSIM-27 - was filed on the PhoSim Jira.

For Twinkles Run 3, this problem was reported in LSSTDESC/Twinkles#312 and, taking advantage of the restricted nature of that run (a single CCD, rather than all sky), it was solved in LSSTDESC/Twinkles#340. However, in general (where there will be more objects due to the larger sky area) the problem is unsolved.
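A small illustration of the failure mode; which precision PhoSim uses internally is not stated here, so single precision below is purely an example:

import numpy as np

# Integer IDs collide once they exceed the float mantissa:
# 2**24 for single precision, 2**53 for double precision.
print(np.float32(2**24 + 1) == np.float32(2**24))   # True: two distinct IDs become equal
print(float(2**53 + 1) == float(2**53))             # True even in double precision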

Prepare subset (single patch?) of data to transfer to lsst-dev for QA setup

@laurenam from DM at Princeton has a set of QA scripts that make diagnostic plots to check the quality of the DM processing. Look here to see examples used on HSC:

https://jira.lsstcorp.org/browse/DM-10044

If we make a subset of the data (perhaps one patch?) and get it to lsst-dev, Lauren has offered to make the tweaks to get her scripts to run on our output. Thanks Lauren! Then, we can install and run them ourselves at NERSC on our full output and later runs.

So the first thing is to understand what part of the output Lauren actually needs and whether it is easy to just copy a sub-directory from the output directories. Then, someone with lsst-dev access will need to copy it to the appropriate area.
