TOP PICK clarification request

a028964
02 Dec, 2019 18:28
Jarrett Trezzo
First, to answer Ruben's original questions, for transparency: I am currently a reviewer and previously a submitter. I did not see these submitted to the review queue.

NGC 7000:
1. The image has zero information entered. This is pretty much a disqualification right from the get-go for most. Take the time to fill out all the capture details, or at least enter them into the description.
2. Bright and common targets like this have hundreds of versions by different people every year. They get tiring, and it's really hard to come across one where you say "wow, that is so much better than all the others." I think a lot of people just get sick of looking at them. In this sense a rare, difficult, or otherwise more interesting target will probably have a better chance at being selected.

Cone Nebula:
1. Again, lacking full information - the exposure detail would be nice - but I think big remote scopes and Hubble data etc. receive less interest (is this the free public data set they released?).
2. Same as above #2.

Brad
And how many current Submitters are there?
Of the original 20, there are 10 that still submit. I'm not sure how many reviewers. Many have just given up or quit in frustration, I believe.

Brad
The next level then would be the reviewers … are they overwhelmed?
Absolutely not - we are underwhelmed if anything. Sometimes it's hard to pick from the review queue on a given day either because the submitted images are not that great or there aren't enough to pick from, or a combination thereof.

Wow. So if I were to make a slightly educated guess, we aren't really talking about subjectivity. It's in play, sure, but this is more of a math problem than a subjectivity problem. Right?

With the drop of 10 Submitters, did the remaining Submitter queue effectively double?
a028964
02 Dec, 2019 18:29
Or are there 10 queues with no eyes on them?
Chris-PA
02 Dec, 2019 18:30
Yikes - 100 a day? Maybe I'm not qualified to be a submitter after all. I sometimes go weeks without checking my feed. Does this mean that every image published is reviewed this way? I've never been 'behind the scenes,' but from an outsider's perspective, I feel there has to be a less overwhelming way. No idea what that would be, though (other than more volunteers, or perhaps some non-queue method?).

A couple of random thoughts: I absolutely love George's suggestion of considering older images. The fact of the matter is that images that deserve recognition get overlooked (and it's no surprise that this happens, now that we know a little more about what's going on behind the scenes). I disagree with Steve Milne's statement that it's irrelevant if the occasional worthy image doesn't get a Top Pick - I mean, of course, in the grand scheme of life on Earth, yes, obviously. But I remember someone in a German astro group on Facebook last year who produced quite an amazing image (his first such top-notch image) that failed to get a nod. He had tried so hard to make something worthy (and succeeded, I might add) and got so frustrated that he deleted his AstroBin account. If you ask me, it was an overreaction, but I get where he was coming from. You put a major effort into achieving something no one has achieved before in this hobby and no one notices.

Last month I asked if there were a way for non-submitters to submit worthy overlooked images. Maybe this could be looked into more? I would like to volunteer, but I don't know if I could review that many images (in fact, I know I couldn't). But maybe a special button next to Like or Bookmark that could shoot an image to a review queue? Like if it's hit a certain number of times by non-submitters? Or maybe it could be simplified, and any image that gets a certain number of bookmarks gets bumped to a special 'give this one a little more attention' queue? I'm just spitballing here…
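Very roughly, something like the sketch below - every name and number here is invented, and this is certainly not AstroBin's actual code, just the shape of the rule I'm imagining:

```python
from dataclasses import dataclass, field

BOOKMARK_THRESHOLD = 25  # totally arbitrary; would need tuning against real traffic


@dataclass
class Member:
    name: str
    is_submitter: bool = False


@dataclass
class Image:
    title: str
    bookmarked_by: list = field(default_factory=list)


def maybe_promote(image: Image, attention_queue: list) -> None:
    """Bump an image into a 'give this one a little more attention' queue
    once enough non-submitter members have bookmarked it."""
    flags = sum(1 for m in image.bookmarked_by if not m.is_submitter)
    if flags >= BOOKMARK_THRESHOLD and image not in attention_queue:
        attention_queue.append(image)
```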
a028964
02 Dec, 2019 18:34
Exactly where I feel I'm leading with the math problem for the first wave. Realistically, this probably needs to be based more on the 'community' in some way - again, formulaic in some way, so that things that should be considered can rise to the top. There could still be a Submitter-type function; it makes sense. But having that be the bottleneck seems unfair to both the Submitters and the community.
TimothyTim
02 Dec, 2019 18:44
Maybe AB should shift some of the responsibility for picking the award pictures a bit to the publisher. For example, I have many images I wouldn’t even consider putting into contention, but there are a few I’m happy with that I might. This should significantly decrease the workload on the committee.
Snjór
02 Dec, 2019 18:46
Ruben Barbosa
Hi everybody,

I have a question for AB submitters and reviewers.

For the sake of transparency of the image selection criteria (Top Picks / IOTD), I would like a short clarification.

Recently I posted 2 images, NGC 7000 (https://www.astrobin.com/kqgqyl) and The Cone Nebula (https://www.astrobin.com/euetw9/), both with almost 200 likes and several comments in which members praised the processing, the detail, the color, the resolution, etc.

As I share the sentiment of the members who commented on the images, and find that the AB jury had a different opinion, I would like clarification on the following questions:

1. Did submitters vote for these images? If not, I would like to know why the two images do not deserve to be TP.

2. Did reviewers vote for these images? If not, I would like to know why the two images do not deserve to be TP.

I think no one is offended by this request for clarification, but I often see TP images that do not deserve to be TP, as well as images that deserve to be TP but are not voted on.

Regards,
Ruben Barbosa

Hi Ruben,
I did not submit image 1, as it had no accompanying detail.

I did not submit image 2, as there were images that day I found better, and again the acquisition data was lacking.

Both images are quite nice, though I find the starless versions a bit odd, as they do not represent reality.

Sigga
gnomus
02 Dec, 2019 18:52
All submitters will see the 100-plus images. And yes, every image submitted goes into the queue and is considered for TP/IOTD - with the exception of those folks who specifically exclude their images from competitions.

The ‘community’ approach being suggested has its own set of problems. IOTD used to be a ‘Likes’-based system, if I remember correctly. But that led to complaints that people were gaming the system by farming for Likes. A system that relies on members of the community clicking a button will tend to favour those people with a large number of followers. It becomes a self-perpetuating group. The current system does at least give everyone a fair crack of the whip.

Final point.  It is not necessarily the case that someone who gets terribly worked up and vocal about something is always correct in their assessment.
Edited 02 Dec, 2019 18:57
jtrezzo
02 Dec, 2019 18:54
Brad
With the drop of 10 Submitters, did the remaining Submitter queue effectively double?
No, the submitter queue is the same for each submitter and contains EVERY image posted to AstroBin, as a feed (unless one has opted out of the IOTD process - and even those are still there, with an X over them). Each submitter can submit up to 3 images per day; those go to the review queue, and the reviewers then submit to the judges for IOTD - these are the ones that become Top Picks.

The effect is that the review queue is halved, since it gets fewer submissions when only half of the submitters participate.
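Back-of-the-envelope, just to illustrate the ceiling (illustrative numbers only, assuming every active submitter used all 3 daily picks, which rarely happens):

```python
# Illustrative only: the daily ceiling on images entering the review queue.
MAX_PICKS_PER_SUBMITTER = 3


def review_queue_ceiling(active_submitters: int) -> int:
    return active_submitters * MAX_PICKS_PER_SUBMITTER


print(review_queue_ceiling(20))  # 60/day with the original 20 submitters
print(review_queue_ceiling(10))  # 30/day now that only half participate
```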
Edited 02 Dec, 2019 18:55
gnomus
02 Dec, 2019 18:58
TimothyTim
Maybe AB should shift some of the responsibility for picking the award pictures a bit to the publisher. For example, I have many images I wouldn’t even consider putting into contention, but there are a few I’m happy with that I might. This should significantly decrease the workload on the committee.

You can choose to exclude images from competition if you wish to do so.
a028964
02 Dec, 2019 18:59
Steve Milne
The ‘community’ approach being suggested has its own set of problems. IOTD used to be a ‘Likes’-based system, if I remember correctly. But that led to complaints that people were gaming the system by farming for Likes. A system that relies on members of the community clicking a button will tend to favour those people with a large number of followers. It becomes a self-perpetuating group.

I can see that for sure. Maybe it's not the sole determiner, but on the flip side, things that should probably be considered may not be.

 
It is not necessarily the case that someone who gets terribly worked up and vocal about something is always correct in their assessment.

Agreed. I'm less concerned about the correctness of said 'assessment'. I feel the demand put on the 'Staff' results in unfair judgement of them and what they are trying to do - basically the impossible. It's a lose-lose proposition that the assessment may not account for. :)
Chris-PA
02 Dec, 2019 18:59
I definitely agree with Tim's suggestion. There are always going to be a few images that we don't give our full effort to and/or that just don't turn out as well as we'd hoped. I'd put at least one third of my recent images in that category, and I already know that half of my next 4-6 images will land in that category as well. I don't want a reviewer wasting time and effort on something I've 'mailed in.'

Edit: Steve said that you can exclude individual images. How do we do that? I thought you could only do a blanket exclude-all.
Edited 02 Dec, 2019 19:02
gnomus
02 Dec, 2019 19:05
It might be a blanket thing, Chris. I’m not sure. And apologies, of course, if I am wrong. If it isn’t, then I assume (on the basis of no technical knowledge whatsoever :) ) that it would be simple enough to add a tick box to exclude images.
RRBBarbosa
02 Dec, 2019 19:20
Sigga, Jarrett Trezzo and Richard Sweeney,

I note that there are many issues to be discussed in this forum, but just going back to the question that led me to open the request for clarification…

I want to thank Sigga, Jarrett Trezzo and Richard Sweeney for the answers given, and at the same time disagree, giving as an example the Cone Nebula image https://www.astrobin.com/420753/, processed with the same data (the free public data set they released) and containing the same information about the acquired data - but the latter was IOTD (by the way, I got the data capture information from the IOTD image)… so I think there are divergent criteria here…

However, I am satisfied with your objective answers.

****************

I just did a search on the net, found the exposure times, and added them to the description field.
Edited 02 Dec, 2019 19:36
Chris-PA
02 Dec, 2019 19:20
It would be cool if it were a forced dropdown when uploading an image (similar to the data source and type of image). Like, "Do you *really* think this image is worthy?" The workload for the submitters sounds insane to me as an outsider. That needs to be spread around, and the people producing the images seem like the ones who should carry at least some (if not most) of that burden.
TimothyTim
02 Dec, 2019 19:26
Coding-wise it should be pretty easy: an action button under "Actions" - "Flag for TP/IOTD consideration."
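Conceptually something like this - all names invented, obviously not AstroBin's real code, just to show how small the feature is:

```python
from dataclasses import dataclass


@dataclass
class Image:
    title: str
    owner: str
    flagged_for_iotd: bool = False  # hypothetical per-image opt-in flag


def flag_for_consideration(image: Image, requesting_user: str) -> None:
    """Owner-only 'Actions' button: opt this one image into TP/IOTD consideration."""
    if image.owner != requesting_user:
        raise PermissionError("only the publisher can flag their own image")
    image.flagged_for_iotd = True
```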
gnomus
02 Dec, 2019 19:27
Chris Sullivan
It would be cool if it were a forced dropdown when uploading an image (similar to the data source and type of image). Like, "Do you *really* think this image is worthy?" The workload for the submitters sounds insane to me as an outsider. That needs to be spread around, and the people producing the images seem like the ones who should carry at least some (if not most) of that burden.

That might dissuade less confident imagers, who might have perfectly fine images, from putting them forward. Plus, don’t you think we already have too many ‘tick boxes’? I’m starting to feel it’s like submitting one’s tax return. :)
TimothyTim
02 Dec, 2019 19:30
Steve Milne
That might dissuade less confident imagers, who might have perfectly fine images, from putting them forward. Plus, don’t you think we already have too many ‘tick boxes’? I’m starting to feel it’s like submitting one’s tax return. :)

Agreed, but that is where comments and likes from the community for an image should persuade someone to put it forward.
Snjór
02 Dec, 2019 19:42
Ruben Barbosa
Sigga, Jarrett Trezzo and Richard Sweeney,

I note that there are many issues to be discussed in this forum, but just going back to the question that led me to open the request for clarification…

I want to thank Sigga, Jarrett Trezzo and Richard Sweeney for the answers given, and at the same time disagree, giving as an example the Cone Nebula image https://www.astrobin.com/420753/, processed with the same data (the free public data set they released) and containing the same information about the acquired data - but the latter was IOTD (by the way, I got the data capture information from the IOTD image)… so I think there are divergent criteria here…

As I said, Ruben: on the day I saw the Cone you did, I thought there were better images. I'm not sure I understand - are you saying your image is a crop of Stan's, or a crop of the same data, and that since his was IOTD yours should be also?

Sigga
chuantian
02 Dec, 2019 19:50
Although I agree that we need to improve our selection of Top Picks and IOTD, I wouldn't submit either of the images you mentioned. Please go look at other Cone Nebula images taken with Chilescope and you will see the reason. Besides, personally I don't submit starless images unless the image is really outstanding.

As a submitter, I would rather not submit any image from the submission queue if there are no qualified images. The reviewers' problem happens to me too: sometimes the images are just far from good. I'm curious whether IOTD is really the image of the DAY. What happens if there aren't any good images on a given day? Will we use past Top Pick images?

PS: Sometimes I see different versions of the same outstanding image in the queue, and it's hard for me to pick a version. ( ̄▽ ̄)
Tian Li
Edited 02 Dec, 2019 19:52
Chris-PA
02 Dec, 2019 19:54
Steve Milne
That might dissuade less confident imagers, who might have perfectly fine images, from putting them forward. Plus, don’t you think we already have too many ‘tick boxes’? I’m starting to feel it’s like submitting one’s tax return.

I see what you're saying, but I'd still like to have more control over it. I've gotten at least two Top Picks for images that I didn't feel were entirely deserving (NOT THAT I DIDN'T APPRECIATE IT), but I also feel like one or two of my better efforts were overlooked for whatever reason. I'd prefer to be able to say 'don't worry about this one' and with that - much more importantly - lessen the overwhelming workload of the submitters.
2ghouls
02 Dec, 2019 19:56
Peter Goodhew
I vote for keeping it, despite its limitations. My reasons:

1. It's massively motivating and encouraging when an image gets recognised as a TP or IOTD.

2. If I want to take a look at other good images of a target I will select the TPs or IOTDs and use them as a benchmark against which I can assess my own images. This often results in my seeing how I need to improve my images before completing them.

Yes, the system isn't perfect, and like Brad and others I can't see how, practically, it can be made perfect (although, as Barry points out, there may be some ways of improving it). But for many of us it is still very valuable despite its imperfections.

I think the problem stems from an expectation that it should, or could, be perfect. I think it would help if we could avoid taking it too seriously, and stop getting upset by anomalies or "injustices" when specific images don't get the recognition they deserve. These are inevitable in a system like this, and we just need to learn to live with them, IMHO.

I agree with everything Peter wrote above.

And here are a few additional thoughts as a current submitter:

It is very difficult to develop a volunteer-based system that will satisfy everyone. The IOTD staff are just volunteers who give up a lot of their time to evaluate the images. I am a submitter; when I look at my queue, I look at every image (100+) to try to be fair. Some days, I simply don't have the time to give my fair assessment, so I don't submit anything. This makes me feel bad for the reviewers, since they will have potentially fewer images to choose from, but some days I just get too busy with work and other commitments. Some days I wish I had never volunteered, but on the whole I think it is a good experience, and anyone who feels strongly about TP/IOTD should volunteer in January and try it from the other side. I worked on some guidelines with other folks this year, but realized they would be used to stir up divisive arguments about images that were "unfairly" picked or not picked. Since that was not my intention with the guidelines, I never finished them.

My three suggestions for improving the process:

1. Figure out a way to make the submitters' job less onerous. I would suggest more submitters, and instead of each seeing all images submitted to the site, I would give each submitter a random selection of 30 and ask them to pick 0 or 1 per day. Currently, I see about 100 and can pick 0-3 per day. If I had a much shorter queue, I think I could give it the time necessary every day. (A rough sketch of what I mean follows after this list.)

2. Develop a SHORT but mandatory training module for incoming IOTD staff that explains the process and gives a few tips on selecting images. As mentioned above, this is sort of what I was trying to do with the guidelines document, but I realized I didn't know how to lead this effort without stirring up controversy and hurt feelings. I am pretty conflict-averse, so I didn't continue. I don't think training would ever preclude subjectivity in judging (and I wouldn't want it to). It would just provide a basic philosophy around evaluating astrophotography. If this never happens, my suggestion for incoming IOTD staff is to watch the presentation Adam Block did for the AstroImaging Channel: Beauty and the Beholder.

3. In January, try to get a mix of different kinds of astrophotographers to volunteer. I feel qualified to evaluate deep-sky and wide-field stuff because I have experience with those. I have a harder time with solar/lunar/planetary images because I have only limited experience.
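Here is the rough shape of suggestion 1 - a sketch only, with made-up names and numbers, not a real implementation:

```python
import random

QUEUE_SIZE = 30  # each submitter sees a random slice instead of all 100+


def assign_daily_queues(new_images: list, submitters: list) -> dict:
    """Deal each submitter a random sample of ~30 images; each then picks
    0 or 1 per day. A real version would also need to guarantee that every
    image still lands in at least one submitter's queue."""
    queues = {}
    for submitter in submitters:
        k = min(QUEUE_SIZE, len(new_images))
        queues[submitter] = random.sample(new_images, k)
    return queues
```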

My 2 cents. I am still on the fence about volunteering again.

Nico
Edited 02 Dec, 2019 20:19
TimothyTim
02 Dec, 2019 20:00
李天
Although I agree that we need to improve our selection of Top Picks and IOTD, I wouldn't submit either of the images you mentioned. Please go look at other Cone Nebula images taken with Chilescope and you will see the reason. Besides, personally I don't submit starless images unless the image is really outstanding.

Images taken with remote professional scopes should generally have an entirely different category and award. As much as I appreciate them, they are given an unfair advantage over people who have put in A LOT of work on their own equipment. The system should be tiered in general, with different sub-categories: I want to see the best image from acquired Hubble data, I want to see the best image taken with a ground-based professional telescope, I want to see the best professional AP image, and even tiers at the amateur level. The path from being a newcomer to reaching the level where one of your images is selected as IOTD is very long and difficult. Giving someone a nudge of recognition for their hard work would go a long way toward keeping them motivated.
Barry-Wilson
02 Dec, 2019 20:33
There is a saying in the UK regarding “not throwing the baby out with the bath water”, meaning that it is an avoidable error in which something good is discarded when trying to get rid of something bad or imperfect; similarly, there is the expression “don’t let the perfect be the enemy of the good”.

I fear that searching for the perfect system (once again) will risk stirring up rancour (once again) when we have a workable and pragmatic system.

Volunteers will come, I’m sure, and a tweak to the Submission Queue to display titles will help. More Submitters with lunar, solar and planetary experience are necessary, I believe, to give more balance; I note from these discussions over the years that it is in general deep-sky imagers raising concerns.

:)
TimothyTim
02 Dec, 2019 20:36
Barry,

As much as I understand where you’re coming from, the number of times this subject has come up, and the fact that it always sparks heated debate, should be a good indicator that the community has outgrown the current system.
morefield
02 Dec, 2019 20:42
Brad
With the drop of 10 Submitters, did the remaining Submitter queue effectively double? Or are there 10 queues with no eyes on them?

No, we all see ALL images submitted to AstroBin. The queue is just the images, cropped to 4:3, two across the screen. There is no information about the image - just the cropped image. If something stands out, you click to see the details, click again to see it full screen, and again to see full resolution.

This is best for images that are tightly cropped to the object.  Not good at all for a small discovery that is not a print-worthy pretty picture.

We are allowed to submit up to 3 images in 24 hours, but there is no requirement to select any.
 