TOP PICK clarification request

morefield
02 Dec, 2019 20:45
TimothyTim
李天
Although I agree that we need to improve our selection of Top Picks and IOTD, I wouldn't submit any of the images you mentioned. Go and look at other Cone Nebulas taken with Chilescope and you will see the reason. Besides, I personally don't submit starless images unless the image is really outstanding.

As a submitter, I would rather not submit any image from the submission queue if there are no qualified images. The problem the reviewers face happens to me too: sometimes the images are just far from good. I'm curious whether IOTD is really the image of the DAY. What happens if there aren't any good images on a given day? Will we use past Top Pick images?

PS: Sometimes I see different versions of the same outstanding image in the queue, and it's hard for me to pick a version. ( ̄▽ ̄)"
Tian Li

Images taken with remote professional scopes should generally have an entirely different category and award. As much as I appreciate them, they are given an unfair advantage over people who have put in a lot of work on their own equipment. The system should be tiered in general, with different sub-categories. I want to see the best image from acquired Hubble data, the best image taken with a ground-based professional telescope, the best professional AP image, and tiers at the amateur level too. The path from being a newcomer to having one of your images selected as IOTD is very long and difficult. Giving someone a nudge of recognition for their hard work would go a long way toward keeping them motivated.

I actually think that professionally acquired data is at a great disadvantage currently.  We see very few Top Picks from that data, in part because I think there is a much higher standard being looked for with that data.  This standard is not explicit in any way, but each submitter/approver is applying some sort of handicapping based on the equipment and level of difficulty.
morefield
02 Dec, 2019 20:52
Chris Sullivan
Yikes - 100 a day? Maybe I'm not qualified to be a submitter after all. I sometimes go weeks without checking my feed. Does this mean that every image published is reviewed this way? I've never been 'behind the scenes,' but from an outsider's perspective, I feel there has to be a less overwhelming way. No idea what that would be, though (other than more volunteers or perhaps some non-queue method???).

I've thought about this while reviewing.  In order to give significantly more attention to each potential image there would simply have to be fewer images to review.  The only way I can think of for that to happen would be for images to be "opt-in" submitted by the original poster.  I would assume that most people posting an image would not feel that it is a potential IOTD or Top Pick.  Of course, I could be wrong on that point.

I think any attempt to use like counts to push an image to the reviewers would be fraught with a myriad of issues.
TimothyTim
02 Dec, 2019 20:54
Kevin,

I think you hit the nail on the head right there with a couple of things:

1. Commonly imaged objects vs. unique and rare objects: Yes, reviewers are seeing a lot of NGC7000, M45, Orion, Horsehead, etc. These objects are imaged constantly and can be tiring for reviewers, but the fact is that imaging these objects with a refractor vs. a Hyperstar was, for me, an entirely different process and an entirely different technique.

2. One method should not be advantaged over the other for IOTD or any other award consideration. They should be separate, each in its own category, competing with images of a similar category. This doesn't mean you'd have to pick an Image of the Day on a daily basis; heck, it could be weekly at that point.
Chris-PA
02 Dec, 2019 21:08
Kevin Morefield
I've thought about this while reviewing. In order to give significantly more attention to each potential image there would simply have to be fewer images to review. The only way I can think of for that to happen would be for images to be "opt-in" submitted by the original poster. I would assume that most people posting an image would not feel that it is a potential IOTD or Top Pick. Of course, I could be wrong on that point.

I think any attempt to use like counts to push an image to the reviewers would be fraught with a myriad of issues.
I wouldn't worry about the former - just take a look at some of the images selected as an AAPODx2 (RIP). Some were completely unworthy (including images from yours truly) and I'd even go so far as to say that other picks were outright terrible. And you had to make an effort and actively submit your images. I don't think this requirement would put too many worthy images out of the running.

I *really* don't think that the number of likes is a good way to judge an image's worthiness, but I feel that bookmarks are different - they're given much more sparingly and usually with good reason. I have an image that is definitely NOT Top Pick worthy that has over 130 likes and only 1 bookmark. That's a pretty terrible likes-to-bookmarks ratio, but honestly that image was totally mailed in, with half the integration time I usually put into images. It's clearly not worthy, and the number of bookmarks speaks to this. Meanwhile, Mark Stiles' image I posted about way up this thread currently has 87 likes and 19 bookmarks (and without the advantage of being a Top Pick) - with good reason! That should indicate to someone 'hey - we've got something special here'! I mean, that's about a 4.6 to 1 ratio (vs. my 130 to 1 ratio). I feel like that might be a good method of preventing such IOTD-worthy images from being passed over going forward.
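A minimal sketch of the bookmark-ratio idea above (the function name and both thresholds are hypothetical illustrations, not anything AstroBin actually implements):

```python
def flag_for_review(likes, bookmarks, min_bookmarks=10, max_likes_per_bookmark=10):
    """Flag an image for reviewer attention when it has a meaningful number of
    bookmarks AND a low likes-to-bookmarks ratio (both thresholds hypothetical)."""
    if bookmarks < min_bookmarks:
        return False
    return likes / bookmarks <= max_likes_per_bookmark

# The two examples from the post:
print(flag_for_review(130, 1))   # 130-to-1 ratio, a single bookmark: not flagged
print(flag_for_review(87, 19))   # ~4.6-to-1 ratio, 19 bookmarks: flagged
```

Requiring a minimum bookmark count first keeps a brand-new image with 1 like and 1 bookmark from being flagged on ratio alone.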
gnomus
02 Dec, 2019 21:22
Chris Sullivan
… I *really* don't think that the number of likes is a good way to judge an image's worthiness, but I feel that bookmarks are different - they're given much more sparingly and usually with good reason….

I would usually bookmark something if it's an object that I might want to image in the future.  I don't need to bookmark the 'well known hits' like M45, M42, etc.  So I end up bookmarking the more obscure stuff.  Whether or not I bookmark has no bearing on what I think of the image.  It could be a poor image of an interesting object.
morefield
02 Dec, 2019 21:27
Sorry to barge in on your work day, Chris…

What if everyone on AstroBin were a submitter - but with a limited number of submits.  Say 10 per week.  And each submission required a sentence of description as to why they feel it is worthy.  Or maybe a set of dropdown answers to alert the submitter panel as to what to look for.  For example: uniqueness of the target, quality of processing, degree of difficulty, etc.

I think that would be feasible.  It would certainly lower the number of images to review from every image to something more manageable.  It might also give us a more democratic peek into what interests the user-base at large.
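The rate-limit part of that proposal is simple to sketch (function and parameter names are hypothetical; only the 10-per-week figure comes from the post):

```python
import datetime

WEEKLY_LIMIT = 10  # from the proposal: 10 nominations per user per week

def can_nominate(nomination_times, now):
    """Return True if a user, whose past nomination timestamps are given in
    `nomination_times`, still has quota left in a rolling 7-day window."""
    week_ago = now - datetime.timedelta(days=7)
    recent = [t for t in nomination_times if t > week_ago]
    return len(recent) < WEEKLY_LIMIT

now = datetime.datetime(2019, 12, 2, 21, 27)
log = [now - datetime.timedelta(days=d) for d in (0.5, 1, 2, 9, 10)]
print(can_nominate(log, now))  # only 3 of the 5 nominations fall in the last 7 days
```

A rolling window like this avoids the Monday-morning rush a fixed calendar week would create.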
2ghouls
02 Dec, 2019 21:31
Kevin Morefield
Sorry to barge in on your work day, Chris…

What if everyone on AstroBin were a submitter - but with a limited number of submits.  Say 10 per week.  And each submission required a sentence of description as to why they feel it is worthy.  Or maybe a set of dropdown answers to alert the submitter panel as to what to look for.  For example: uniqueness of the target, quality of processing, degree of difficulty, etc.

I think that would be feasible.  It would certainly lower the number of images to review from every image to something more manageable.  It might also give us a more democratic peek into what interests the user-base at large.

I like this idea Kevin.

Cheers, Nico
Chris-PA
02 Dec, 2019 21:37
Steve Milne
I would usually bookmark something if it's an object that I might want to image in the future. I don't need to bookmark the 'well known hits' like M45, M42, etc. So I end up bookmarking the more obscure stuff. Whether or not I bookmark has no bearing on what I think of the image. It could be a poor image of an interesting object.

All fair points. I usually use bookmarks the same way, but I will bookmark outstanding examples of the well-known stuff (and I think most users do this as well). Actually, I'm kind of astounded at the number of Crescents and M20s I've bookmarked… Again, my suggestion is not that it automatically makes an image a Top Pick, but rather that it pushes an image beyond the submitter stage for a reviewer to look at - i.e., that it serves as a backup in case submitters missed an outstanding image for whatever reason. I don't think likes can reveal this, but I do suspect that bookmarks can. I'll bet if Salvatore produced a list of most bookmarked images that weren't selected as Top Picks or IOTD, we'd have a nice selection of images that were absolutely worthy - perhaps that's where George's suggestion of a special accolade could first apply.
Chris-PA
02 Dec, 2019 21:38
Kevin Morefield
Sorry to barge in on your work day, Chris…

What if everyone on AstroBin were a submitter - but with a limited number of submits. Say 10 per week. And each submission required a sentence of description as to why they feel it is worthy. Or maybe a set of dropdown answers to alert the submitter panel as to what to look for. For example: uniqueness of the target, quality of processing, degree of difficulty, etc.

I think that would be feasible. It would certainly lower the number of images to review from every image to something more manageable. It might also give us a more democratic peek into what interests the user-base at large.
I love this idea - I'm 100% on board for this.
a028964
02 Dec, 2019 21:45
Kevin Morefield
Sorry to barge in on your work day, Chris…

What if everyone on AstroBin were a submitter - but with a limited number of submits.  Say 10 per week.  And each submission required a sentence of description as to why they feel it is worthy.  Or maybe a set of dropdown answers to alert the submitter panel as to what to look for.  For example: uniqueness of the target, quality of processing, degree of difficulty, etc.

I think that would be feasible.  It would certainly lower the number of images to review from every image to something more manageable.  It might also give us a more democratic peek into what interests the user-base at large.

I think this is a good idea. So this would be submitting to the Submitters or directly to the Reviewers?
morefield
02 Dec, 2019 21:50
Brad
Kevin Morefield
Sorry to barge in on your work day, Chris…

What if everyone on AstroBin were a submitter - but with a limited number of submits.  Say 10 per week.  And each submission required a sentence of description as to why they feel it is worthy.  Or maybe a set of dropdown answers to alert the submitter panel as to what to look for.  For example: uniqueness of the target, quality of processing, degree of difficulty, etc.

I think that would be feasible.  It would certainly lower the number of images to review from every image to something more manageable.  It might also give us a more democratic peek into what interests the user-base at large.

I think this is a good idea. So this would be submitting to the Submitters or directly to the Reviewers?

Initially, I’d set it to fill the current submitters queue.  If the volume was manageable enough they could go direct to reviewers.
a028964
02 Dec, 2019 21:59
Kevin Morefield
Brad
Kevin Morefield
Sorry to barge in on your work day, Chris…

What if everyone on AstroBin were a submitter - but with a limited number of submits.  Say 10 per week.  And each submission required a sentence of description as to why they feel it is worthy.  Or maybe a set of dropdown answers to alert the submitter panel as to what to look for.  For example: uniqueness of the target, quality of processing, degree of difficulty, etc.

I think that would be feasible.  It would certainly lower the number of images to review from every image to something more manageable.  It might also give us a more democratic peek into what interests the user-base at large.
I think this is a good idea. So this would be submitting to the Submitters or directly to the Reviewers?

Initially, I’d set it to fill the current submitters queue.  If the volume was manageable enough they could go direct to reviewers.

Agree it should drive the submitter queue. 10 a week times the number of users is still a huge number to me :-) AND I would move to make the image submitter explicitly indicate whether the image should be considered for awards or not. The latter might naturally reduce the size of the queues…
Edited 02 Dec, 2019 22:01
whwang
03 Dec, 2019 02:40
Hi Kevin,

Are you proposing that the crowd "nomination" completely replaces the current situation where every image goes to the submission queue?  I think this can work, except for one point: the requirement to write a sentence can put off some users whose native language is not English.  I think this could be made optional.  Most images that deserve TP/IOTD already have multiple comments that point out in what way the images stand out.
Edited 03 Dec, 2019 02:41
morefield
03 Dec, 2019 03:01
Wei-Hao Wang
Hi Kevin,

Are you proposing that the crowd "nomination" completely replaces the current situation where every image goes to the submission queue?  I think this can work, except for one point: the requirement to write a sentence can put off some users whose native language is not English.  I think this could be made optional.  Most images that deserve TP/IOTD already have multiple comments that point out in what way the images stand out.

If not a sentence, maybe just some selections to indicate why they are nominating.  I think it will help direct the attention of those reviewing the queue.
gnomus
03 Dec, 2019 07:06
I have my doubts about how enthusiastically the 'crowd' might engage in such a system, especially if there is a requirement for (even modest) form-filling.  The volunteer positions were not filled this time around - which is one measure of 'enthusiasm'.  And, although there has been quite a bit of activity relating to this thread, the number of posters has been relatively limited (and we are the 'usual suspects' :) ).  I would imagine that IOTD is not a major issue for most users: they will take a peek, enjoy the image (or not), and move on without it giving them too much cause for contemplation (or distress).

If a community-nomination system were introduced, I'm not clear how this would work.  Would the submitters queue be made up solely of images nominated by the crowd?  If so, I'm not sure I see the need for the three levels of supervision we have at present.  Would one 'vote' be sufficient to get an image into a queue?  And if it is to be a crowd-nomination system, how do we address the issues that arise with such a system?  These include (but are not limited to): people trying to game the system (as they did before); or the fact that people with a large number of followers will have a significant advantage over people with fewer followers (and new users, of course).  (We'd end up with yet more 'Likes' threads, or their equivalents!)

For all its downsides, the current system does at least compel submitters to look at ALL new images on an equal footing - giving those with fewer followers a fairer crack of the whip.

I do remember that the old system generated quite a lot of discord too - with folks deleting their accounts (albeit temporarily in some cases) and so forth.  There is a danger of going around in circles.
Andys_Astropix
03 Dec, 2019 07:17
Kevin Morefield
What if every one on Astrobin were a submitter - but with a limited number of submits.  Say 10 per week.  And each submission required a sentence of description as to why they feel it is worthy.  Or maybe a set of dropbox answers to alert the submitter panel as to what to look for.  For example, uniqueness of the target, quality of processing, degree of difficulty, etc.

Love the passion & some very interesting ideas here!  With some fine-tuning they could help ease the submitters' load. However, to ensure a quality field, may I suggest the following…

a) an opt-in button should be available for photographers who want their images considered for IOTD/TP selection, with data fields mandatory if they choose to enter

b) anyone can submit (except reviewers & judges), but only those who have received a TP/IOTD can submit from the posted image pool

c) reviewers need to have at least two TPs (or IOTDs) to qualify, and an image needs a minimum of two reviewers' votes to proceed to TP.

d) judges need to have at least one IOTD & two TPs to qualify, and an image requires three judges' votes to get IOTD.

e) judges can enter and are eligible for TP/IOTD (but obviously they cannot vote for their own image)

If we have a dedicated & willing team, we won't need as many people in these respective roles, and there's a clear pathway to advance through the system.

i.e.:
- entrants can become submitters once they get a TP
- submitters can become reviewers upon gaining two TPs
- reviewers can become judges once they've received an IOTD

This should keep everyone happy!

cheers
Andy
Edited 03 Dec, 2019 07:48
chuantian
03 Dec, 2019 07:53
TimothyTim
李天
Although I agree that we need to improve our selection of Top Picks and IOTD, I wouldn't submit any of the images you mentioned. Go and look at other Cone Nebulas taken with Chilescope and you will see the reason. Besides, I personally don't submit starless images unless the image is really outstanding.

As a submitter, I would rather not submit any image from the submission queue if there are no qualified images. The problem the reviewers face happens to me too: sometimes the images are just far from good. I'm curious whether IOTD is really the image of the DAY. What happens if there aren't any good images on a given day? Will we use past Top Pick images?

PS: Sometimes I see different versions of the same outstanding image in the queue, and it's hard for me to pick a version. ( ̄▽ ̄)"
Tian Li

Images taken with remote professional scopes should generally have an entirely different category and award. As much as I appreciate them, they are given an unfair advantage over people who have put in a lot of work on their own equipment. The system should be tiered in general, with different sub-categories. I want to see the best image from acquired Hubble data, the best image taken with a ground-based professional telescope, the best professional AP image, and tiers at the amateur level too. The path from being a newcomer to having one of your images selected as IOTD is very long and difficult. Giving someone a nudge of recognition for their hard work would go a long way toward keeping them motivated.

Actually, I've seen someone with a better scope than Chilescope. But your idea is good; even APOY has a remote telescope group.
As for my judging criteria, I think a high standard is the only way to make sure that the IOTD has good quality.
2ghouls
03 Dec, 2019 09:50
Steve Milne
Would one 'vote' be sufficient to get an image into a queue?  And if it is to be a crowd-nomination system, how do we address the issues that arise with such a system?  These include (but are not limited to): people trying to game the system (as they did before); or the fact that people with a large number of followers will have a significant advantage over people with fewer followers (and new users, of course).  (We'd end up with yet more 'Likes' threads, or their equivalents!)

For all its downsides, the current system does at least compel submitters to look at ALL new images on an equal footing - giving those with fewer followers a fairer crack of the whip.
I don’t want to speak for Kevin since this was his idea, but the way I imagined it was that yes, one vote would put an image into the submitter’s queue, and folks could self-submit. If this were the case, and it wasn’t based on number of votes, I don’t see how the issues you mention would give anyone an unfair advantage. The upside to this system is that we would only be seeing images in the submitters queue that someone out there thinks should be considered for a TP/IOTD.  I also really like the idea of a 1-question form asking why they think a photo is worthy, because it will make people think about their choice before submitting. I agree that anything beyond 1 question would severely decrease participation, because most people don’t care.

As for current submitters looking at (actually clicking on and examining) every image: I try to, yes, but I’m not sure if the other submitters do, and I was given no guidance when I signed up that this was expected. I think many make initial judgements based on the little cropped thumbnails, since there are so many images, and the issues that arise because of that have been discussed many times.
tolgagumus
03 Dec, 2019 10:50
Ruben Barbosa
Hi everybody,

I have a question for AB submitters and reviewers.

For the sake of transparency of the image selection criteria (Top Picks / IOTD), I would like a short clarification.

Recently I posted 2 images, NGC 7000 (https://www.astrobin.com/kqgqyl) and The Cone Nebula (https://www.astrobin.com/euetw9/), both with almost 200 likes and several comments in which members praised the processing, the detail, the color, the resolution, etc.

As I share the same sentiment as the members who commented on the images, and find that the AB jury had a different opinion, I would like to be clarified on the following questions:

1. Did submitters vote for these images? If not, I would like to know why the two images do not deserve to be TP?

2. Did reviewers vote for these images? If not, I would like to know why the two images do not deserve to be TP?

I think no one is offended by this request for clarification, but I often see TP images that do not deserve to be TP, as well as images that deserve to be TP but are not voted on.

Regards,
Ruben Barbosa

Hi Ruben,

I don't normally do this, but since you asked, I will. I am a judge, not a reviewer or submitter, but if these two images came to me as TPs I would not have picked them as IOTD, and here is why. On your NGC7000 the background is too noisy and the bright areas lack dynamic range. On the second image, the Cone Nebula from ChileScope, in general it's excellent until you take a closer look, where there is a horizontal line across the entire image. We have a loose guideline, thanks to some of the previous judges, and we tend to look for excellence when it comes to professionally acquired data unless it's a very unique target.

Sometimes people get a lot of likes and comments because they have many followers; sometimes it's because they own the same camera, etc. This is why we moved away from selecting images based on number of likes, and now we have a process: 3 independent people have to select an image for it to become IOTD.

Now, have there been times when images were overlooked? Of course; we are all human and can make mistakes.
gnomus
03 Dec, 2019 13:02
Nico Carver
I don’t want to speak for Kevin since this was his idea, but the way I imagined it was that yes, one vote would put an image into the submitter’s queue, and folks could self-submit. If this were the case, and it wasn’t based on number of votes, I don’t see how the issues you mention would give anyone an unfair advantage…

I see.  So what would end up in the submitters’ queue would be all images that are self-submitted, along with images that the photographer did not want to submit but that were submitted anyway by another user who saw and liked the image?  Should we be able to override the wishes of the copyright holder in this way?

I would prefer an ‘opt out’, rather than an ‘opt in’ system.  That is because an ‘opt in’ system discriminates in favour of the pushy, narcissistic type of personality versus the shy, retiring type.   ‘Opt out’ is much more passive than ‘opt in’.
a028964
03 Dec, 2019 13:12
Steve Milne
I would prefer an ‘opt out’, rather than an ‘opt in’ system.  That is because an ‘opt in’ system discriminates in favour of the pushy, narcissistic type of personality versus the shy, retiring type.   ‘Opt out’ is much more passive than ‘opt in’.

You're absolutely right. However, what about the perspective of the 'Submitter'? Just saying. If the system is to be as fair as possible to all, then how do you ensure that the Submitter job is manageable and meaningful enough to 'perform' it? Overwhelming a Submitter is equally unfair to them and to the imager, shy or narcissistic.

I work with data for a living, about 1.3 million new data points daily. If I had to hire or rely entirely on people to look at all of those each day to find the TP (or the bad data, in this case), they wouldn't work at it long before getting demotivated and leaving. The way to make their role valuable (and motivating) is to surface the things that are interesting to them and let them do their job.
Edited 03 Dec, 2019 13:14
gnomus
03 Dec, 2019 13:26
The issue of the submitters being overwhelmed is a major one.  However, I’m not convinced that opt in or opt out would cut down the volume all that much (we might cut out some of the ‘equipment’ photos).

Certainly, however, we need more submitters looking at fewer images.  What if images hung around in the queue for less time?  Or perhaps, the submitters should be split into two or more groups (depending how many volunteers we get) looking at only a portion of the total.

And it goes without saying that subjecting these guys and gals to (often unwarranted) criticism is unlikely to help.
Edited 03 Dec, 2019 13:33
Die_Launische_Diva
03 Dec, 2019 13:45
Steve Milne
The issue of the submitters being overwhelmed is a major one.  However, I'm not convinced that opt in or opt out would cut down the volume all that much (we might cut out some of the 'equipment' photos).

Certainly, however, we need more submitters looking at fewer images.  What if images hung around in the queue for less time?  Or perhaps the submitters should be split into two or more groups (depending how many volunteers we get), looking at only a portion of the total.

And it goes without saying that subjecting these guys and gals to (often unwarranted) criticism is unlikely to help.
I agree 100%.

Regarding the submitters' workload, what if each submitter were presented with only a random subset of the available images from a given time frame? This would reduce their workload, would increase the probability of each image being seen and evaluated, and is probably very easy to implement given the current infrastructure.

Now, what can I do? I can volunteer as a submitter for the next year. Offering a hand of help is better than dropping a critique and then running away :)
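The random-subset idea above could be sketched roughly like this (function names, counts, and the number of copies per image are all hypothetical; this is just an illustration of the assignment, not AstroBin's code):

```python
import random

def assign_random_subsets(image_ids, submitters, copies=2, seed=None):
    """Give each image to `copies` distinct submitters chosen at random, so
    every image gets looked at while each submitter reviews only a fraction."""
    rng = random.Random(seed)
    queues = {s: [] for s in submitters}
    for img in image_ids:
        for s in rng.sample(submitters, copies):
            queues[s].append(img)
    return queues

# 300 new images, 15 submitters, each image shown to 2 of them:
queues = assign_random_subsets(range(300), [f"submitter{i}" for i in range(15)], copies=2)
# each submitter now sees about 40 images instead of all 300
```

Giving each image more than one copy keeps a single inattentive submitter from sinking an image, while the per-submitter load still drops sharply.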
pricech44
03 Dec, 2019 14:17
Please, everyone, stop with the “my image should have got…” whining about Top Pick and IOTD. I for one am not here for recognition! I am here because I love doing this; if you're here to get a medal, and that's the only way you judge your own work, then maybe you should get another hobby. I have been on AB for five years and never got any recognition for any of my images. I couldn't care less; I get a lot more than that just from learning from others and trying to get the best out of my equipment that I can.  Don't judge yourself on being recognized; be proud of yourselves for being able to do this most unusual hobby. I think from the lowest level of images to the highest, all deserve a round of applause.

Enjoy what we are doing and don’t do it just to get a reward.

Chris
Edited 03 Dec, 2019 14:18
gnomus
03 Dec, 2019 15:00
Chris Price
Please, everyone, stop with the “my image should have got…” whining about Top Pick and IOTD….

+++++
 