IOTD and Why It Needs Improvement · AstroBin Platform open discussions community forum · Bill Long - Dark Matters Astrophotography

JohnHen 7.78 · 2 likes
I have to wonder how it is legitimate that my image was seen by only half of the Reviewers who got it.


+1
SparkyHT 3.01
Still to this day, I wonder how one of my images slipped completely through the process. 88 hours, the entire summer it took to collect the data, in Bortle 6 skies, Canadian wildfire smoke be damned. It failed to even meet TPN status; it would have been nice to have feedback so I don't keep making the same mistakes in my processing and/or data collection.
Die_Launische_Diva 11.14 · 2 likes
SparkyHT:
Still to this day, I wonder how one of my images slipped completely through the process. 88 hours, the entire summer it took to collect the data, in Bortle 6 skies, Canadian wildfire smoke be damned. It failed to even meet TPN status; it would have been nice to have feedback so I don't keep making the same mistakes in my processing and/or data collection.

If I spend 88 + 1 hours at the same target using my toy equipment, is there any guarantee that my image will be better than yours?
JohnHen 7.78 · 3 likes
Arun H:
here are two improvements that I’d like to propose outside of categories:
remove ALL information, including title, from the image as it moves through the process. Let the image stand completely on its own against other images. We don’t have integration time in the details in the queue. Why should it then appear in the title?



I see where you're coming from with this argument. On the one hand, the IOTD process page says that gear etc. should not matter, but then the camera and scope size are listed. How does this fit together?

Anyway, I also want to offer a different point of view: I would in fact welcome it if more details were provided. The imager could describe what difficulties they faced while imaging, what they had in mind to emphasize, how they think their image is distinguished from other similar images, etc. Such information could help to "understand" the image.

But currently the image must be judged without any background info. To make a comparison to research: that is like having to judge/review a paper that presents only the results, without saying how those results were obtained or how they advance the field. Needless to say, that would be unacceptable.
cioc_adrian
Arun H:
here are two improvements that I’d like to propose outside of categories:
remove ALL information, including title, from the image as it moves through the process. Let the image stand completely on its own against other images. We don’t have integration time in the details in the queue. Why should it then appear in the title?

JohnHen:
I see where you're coming from with this argument. On the one hand, the IOTD process page says that gear etc. should not matter, but then the camera and scope size are listed. How does this fit together?

Anyway, I also want to offer a different point of view: I would in fact welcome it if more details were provided. The imager could describe what difficulties they faced while imaging, what they had in mind to emphasize, how they think their image is distinguished from other similar images, etc. Such information could help to "understand" the image.

But currently the image must be judged without any background info. To make a comparison to research: that is like having to judge/review a paper that presents only the results, without saying how those results were obtained or how they advance the field. Needless to say, that would be unacceptable.

When you have ~23818 images of M42, what can another one bring to the table, or how can it be distinguishable?
HegAstro 11.91
JohnHen:
But currently the image must be judged without any background info. To make a comparison to research: that is like having to judge/review a paper that presents only the results, without saying how those results were obtained or how they advance the field. Needless to say, that would be unacceptable.



I was looking at this more simply. It was determined years ago that integration time should not be made available to the IOTD staff at the submitter and reviewer levels. The reason is that it is impossible to verify without access to the raw data and, in any case, 40 hours from Bortle 8 is very different from 40 hours from Bortle 2. That being the case, the information should not be provided or made available in a different way, such as in the title, because, as you say, the natural tendency of humans is to make allowances for quality when the difficulty of doing something is high. Knowing that something is a gigantic mosaic that took six months and heaven knows how much computational power and frustration to complete, I'd be more inclined to forgive poor stars or black clipping. But the current IOTD manifesto makes clear that that cannot be taken into account.

In the case of Chad's image, which did not even get a TPN, the submitters had no knowledge of the degree of effort involved. Would their decision have been different if they had known? How do we compare the effort Chad put in to the effort Tim did? By the way, I image from a location very close to Chad's, so I know how hard it is to get 88 hours of data, especially last year with all the smoke.
siovene · 4 likes
Personally, the way I intended things initially agrees with what you're saying, Arun. I would also prefer that ALL information be hidden.

The titles, to prevent influence by sensationalism and to make it more difficult to go and look up the images; and the equipment, because it should not matter. Integration time is already not shown.

At some point, quite a few years ago, those pieces of information were indeed not available! But I remember that there were some discussions (and some uproar) and I ended up being swayed. I'll be honest though: I cannot recall what the arguments were.
coolhandjo 1.91 · 4 likes
I remember the days when website forum feedback was the only way to know if your image was appreciated as much as you appreciate it. I also recall that for some reason I thought all my images deserved praise - but in hindsight they didn't.

If people want to take the time to review astrophotography in a genuine, dedicated system like IOTD, then that is much better than nothing at all.

I like it.

One day I was bored, so I gathered a lot of stats on IOTD, TP, and RTP. What I found were very consistent themes around equipment, integration time, and framing. In other words, there was no random bias I could detect overall, and it confirmed to me that the system is working on the whole.
andreatax 7.56 · 1 like
Random bias is an oxymoron.
coolhandjo 1.91 · 1 like
andrea tasselli:
Random bias is an oxymoron.

Not when considering multiple judges. In that case, persistent bias would show up where there was no correlation to equipment type and integration time, which would lead to a conclusion of systemic bias. However, my results indicated that the majority of awarded images correlated with equipment type and integration time, with very little deviation (or random bias) in the data. This meant that even a few judges "randomly" being biased, without regard to equipment and integration time, was not a persistent effect.
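To make the kind of check described above concrete, here is a minimal sketch of a chi-square test of independence between equipment class and award outcome. The counts and equipment categories below are hypothetical placeholders for illustration only, not Coolhandjo's actual data.

import numpy as np
from scipy.stats import chi2

# Hypothetical contingency table: rows = equipment class, columns = [awarded, not awarded].
# All counts are made up for illustration only.
observed = np.array([
    [50, 450],   # small refractors
    [52, 448],   # medium reflectors / RCs
    [20, 180],   # camera lenses / other
], dtype=float)

# Expected counts under the null hypothesis that award outcome is independent of equipment.
row_totals = observed.sum(axis=1, keepdims=True)
col_totals = observed.sum(axis=0, keepdims=True)
expected = row_totals @ col_totals / observed.sum()

# Pearson chi-square statistic, degrees of freedom, and p-value.
chi2_stat = ((observed - expected) ** 2 / expected).sum()
dof = (observed.shape[0] - 1) * (observed.shape[1] - 1)
p_value = chi2.sf(chi2_stat, dof)

print(f"chi2 = {chi2_stat:.2f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value would indicate that award rates differ systematically by equipment class;
# a large p-value is consistent with "no detectable systematic bias", as argued above.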
HegAstro 11.91
Coolhandjo:
In other words, there was no random bias I could detect overall, and it confirmed to me that the system is working on the whole.


I agree with you that there is no systematic bias. I cannot speak for others, but I never brought up the issue of the judges being biased. However, the lack of conscious bias on a large scale does not mean the system has no room for improvement. Such areas of improvement as we have been debating, at least in the later part of this thread, have been related to improved feedback on images, and improved ways of highlighting certain types of work. That has nothing to do with bias. Of course, there will always be discussion and debate on specific images, but that does not lend itself to analysis using statistics.
Die_Launische_Diva 11.14 · 2 likes
I believe that providing feedback within the IOTD process, which has to come to a decision in such a short amount of time, is infeasible. Someone, especially with the qualifications of the Judges, has limits on what services they can provide pro bono.

Members are free to ask for constructive criticism via the RCC section of the forum. From what I've seen so far, many members are open to criticism and have received plenty of help.
JohnHen 7.78 · 4 likes
Arun H:
have the judge that promoted an image write a short paragraph, anonymously ( or they can include their name if they wish) as to why they chose to advance that image. They don’t have to defend why they didn’t choose some other image - just what they liked about this one.




Another great idea, with a great learning effect. After all, the judges are very experienced, and it would be great to share in the judges' experience through info provided with an IOTD:
- an intro to the object itself
- the image: techniques used and what makes it unique/distinct, etc.
This could be just a single sentence or a paragraph (the judge can choose how much to write). I believe this would be doable for judges because they promote on average one(?) image per week. It would be great to hear specifically from judges what they think of this idea and whether it would be feasible and enhance the overall value of IOTD.
Thanks.
profbriannz 16.18 · 3 likes
JohnHen:
Another great idea, with a great learning effect. After all, the judges are very experienced, and it would be great to share in the judges' experience through info provided with an IOTD:
- an intro to the object itself
- the image: techniques used and what makes it unique/distinct, etc.
This could be just a single sentence or a paragraph (the judge can choose how much to write). I believe this would be doable for judges because they promote on average one(?) image per week. It would be great to hear specifically from judges what they think of this idea and whether it would be feasible and enhance the overall value of IOTD.
Thanks.



I think this is a great suggestion.  IOTD is based on subjective assessment, but it would be great to have a little guidance on what that subjective assessment is.

This would help me personally, as I have 49 images which have (or will) become Nominations for Top Picks or Top Picks, but no IOTD as yet. From a casual inspection of the astrophotographers list on AB, I think there is only one other person on AB who has more NFTPs and TPs without ever getting an IOTD.

Don't get me wrong, I am delighted to have received this much recognition from my peers. But it leaves me wondering how I can improve my images to take them to the next level. It may be perceived as shallow, but getting an IOTD one day would mean a lot to me. However, I don't know in which direction to go. Commentary on each IOTD pick would certainly help with that.

[As a side note, the selection from NFTP -> TP -> IOTD is clearly non-random - a quick statistical test of the distribution tells you that.  So there are clearly things that appeal more than others to the judges.]
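To make the "quick statistical test" mentioned above concrete, here is a minimal sketch of the calculation under purely random selection. The promotion rate used below is an assumed, illustrative figure rather than an official AstroBin statistic, and all 49 images are treated as IOTD-eligible Top Picks for simplicity.

# Under the (illustrative) assumption that each Top Pick has an equal, independent
# chance of becoming IOTD, how surprising is 0 IOTDs out of 49 Top Picks?
iotds_per_year = 365            # one IOTD per day
top_picks_per_year = 1500       # assumed volume of Top Picks per year (hypothetical)

p_promotion = iotds_per_year / top_picks_per_year   # ~0.24 under these assumptions
n_images = 49                                       # treating all 49 as eligible Top Picks

# Probability of zero promotions among n_images under uniform random selection.
p_zero = (1 - p_promotion) ** n_images
print(f"P(0 IOTDs in {n_images} Top Picks under random selection) = {p_zero:.2e}")
# A vanishingly small probability means outcomes like this are very unlikely if judges
# picked at random, i.e. the selection clearly favours some images over others.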
messierman3000 4.02
Arun H:
have the judge that promoted an image write a short paragraph, anonymously (or they can include their name if they wish) as to why they chose to advance that image. They don’t have to defend why they didn’t choose some other image - just what they liked about this one.


That would be cool

+1 for me
jeffbax 13.12 · 7 likes
Hi my friends,

The discussion is getting heated here.

I am a judge, and I would agree to share with the authors of each image I choose what made me push their work forward.

But a lot of the concerns here are aimed at the judges... I must say that every 2 days I see an exceptional image that did not even reach TPN!

OK, no offense intended...

This is an extraordinary international ranking process for astrophotography. It has flaws, indeed, but enjoy it; your turn will come if you keep working...

JF
HegAstro 11.91 · 2 likes
Jeffbax Velocicaptor:
Hi my friends,

The discussion is getting heated here.

I am a judge, and I would agree to share with the authors of each image I choose what made me push their work forward.

But a lot of the concerns here are aimed at the judges... I must say that every 2 days I see an exceptional image that did not even reach TPN!

OK, no offense intended...

This is an extraordinary international ranking process for astrophotography. It has flaws, indeed, but enjoy it; your turn will come if you keep working...

JF

Hi Jeff, 

I agree with you that there are many images that are great but do not reach TPN level. That is a more significant problem to solve - but many people do not even agree it is a problem. So my thinking is: let us make some smaller changes that add to clarity and improve the usefulness of IOTD. Hence the suggestion of the judge stating what they saw in the image that gave them reason to advance it. My thought is that it should be shared with the community and not just the authors, similar to the write-up below APOD images.

Since this can be a controversial and emotional topic, the judge can choose to stay anonymous if they like.

I do not speak for others, but I will be honest in saying that my contributions here in this thread are not because I someday want an IOTD - in fact I am perfectly content if I never receive one - but because I like this community and want to see us recognize many types of images rather than create walls and barriers.


Arun
profbriannz 16.18
Jeffbax Velocicaptor:
Hi my friends,

The discussion is getting heated here.

I am a judge, and I would agree to share with the authors of each image I choose what made me push their work forward.

But a lot of the concerns here are aimed at the judges... I must say that every 2 days I see an exceptional image that did not even reach TPN!

OK, no offense intended...

This is an extraordinary international ranking process for astrophotography. It has flaws, indeed, but enjoy it; your turn will come if you keep working...

JF



Thanks, I am working at it, so I do hope I get there!

Also agree about the TPN category - another facet of the subjective process (and the sheer number of images going onto AB). I don't know if it might reassure folks, but I won 2nd prize in NZ's 2022 national AP competition for an image that failed to make TPN.

There are just so many great images out there - and it's what makes AB the site it is.
rockstarbill 11.02 · 1 like
Salvatore Iovene:
I would also prefer that ALL information be hidden.




I have made this point here in this thread and directly to you countless times. The fact that this is not a completely blind voting process is a serious red flag. 

Nonetheless, I have seen more diversity in the images that have been selected in the time since this thread was posted - so it seems at least some of the feedback was taken to heart.

-Bill
kv54 1.81 · 1 like
Hi all,

Let me summarise the discussion from my point of view:

- IOTD is based on the subjective judgement of judges who may prefer certain types of objects
- No democratic process
- Technical mistakes by contributors are often overlooked (I can send examples if needed)
- The benefit for "average" paying members is limited

Based on this, I would strongly recommend stopping IOTD! Likes tell you what the community wants to know!

BR,

Klaus
siovene · 7 likes
Hey,

I would like to quickly send another message to you all to thank you again for your investment, time, interest and feedback, and address a couple of the suggestions.

Regarding adding the opportunity for the judges to add a note to an IOTD when they make one: I could relatively easily add this feature. I agree that it wouldn't be too much work, as one judge can appoint only one IOTD per week anyway. I suspect it might not be very useful, because IOTDs are great images, and what's a judge going to say that isn't repetitive, or that the photographer doesn't already know? It will often be just compliments, and I don't know if an IOTD needs that :-)

Frankly speaking, picking an IOTD is very difficult. Most of the images in the Judges' queue are fantastic and could just as well be IOTD. Having a judge add a comment like "Fantastic work, great composition, hard subject to image, great processing" might also make others feel like "My image has these properties too but it wasn't selected, what gives?!?"

Personally, if it was my picture, I'd be more interested in constructive feedback on how to improve. But given that this isn't possible in the context of how the IOTD selection operates, this requires a different model and a different set of volunteers (something open to the whole community). I think that such a thing would fit perfectly with AstroBin's principles and why I created this website, so it's definitely something I want to prioritize.

Prioritizing means focusing my efforts and not being distracted by a million smaller features that provide little impact, like allowing judges to express compliments (which they can simply do in the form of a comment below the image they have selected).

Another suggestion I wanted to address was the one about reserving some slots for certain kinds of images. This runs the risk of not being able to use one's slots efficiently as the mix of images being submitted changes. For instance, dedicating certain slots to galaxies and certain slots to nebulae would mean that things have to be adjusted as the submissions change during the year (galaxy season, for instance). Doing the same for solar system objects causes the same problem in the case of spikes, such as for an eclipse.

Moreover, what problem is this trying to solve? The percentages of IOTDs per subject type are already pretty well aligned with the percentages of images per subject type being submitted, so enforcing quotas would solve a problem we don't actually have.

Not to mention the fact that this is not something that can be implemented easily, and has a high risk of introducing issues that are hard to solve on a live system.

Any sort of split by subject type, to create multiple IOTD/TP paths, is just another way of saying "we want more awards", or "we want it to be easier to get an award". You can achieve the same by changing the number of slots and the size of the team, without actually doubling or tripling the number of IOTDs and opening the gates for everyone to ask for yet another category.

As a matter of fact, now that my latest changes to the process are in place (more Reviewer slots and a few more reviewers), we're getting 2x as many Top Picks on a daily basis, which I'm hoping satisfies some of you.

Thank you again for your feedback and clear skies!
Salvatore
jeffbax 13.12 · 2 likes
Salvatore Iovene:
As a matter of fact, now that my latest changes to the process are in place (more Reviewer slots and a few more reviewers), we're getting 2x as many Top Picks on a daily basis, which I'm hoping satisfies some of you.


Thank you for this, Salva; the judges' queue is much more consistent now, with more variety.

JF
rockstarbill 11.02 · 1 like
It happened again.

06 April 2024 - Chamaeleon Molecular Complex (MrSpaceman) - Full resolution | AstroBin

No disrespect to the imager, but having two tones of stars in that very large field (those being neon orange and neon blue) is neither a trait nor an attribute of a great image.

It is an image with a lot of dust. 

You folks in the IOTD process cannot go blind to technical matters and just fall over like this for dust. It makes the whole process look terrible!

-Bill
andreatax 7.56 · 1 like
Bill Long - Dark Matters Astrophotography:
No disrespect to the imager, but having two tones of stars in that very large field (those being neon orange and neon blue) is neither a trait nor an attribute of a great image.


I couldn't agree more and that wouldn't be the first instance either.
Overcast_Observatory 20.43 · 2 likes
Bill Long - Dark Matters Astrophotography:
It happened again.

06 April 2024 - Chamaeleon Molecular Complex (MrSpaceman) - Full resolution | AstroBin

No disrespect to the imager, but having two tones of stars in that very large field (those being neon orange and neon blue) is neither a trait nor an attribute of a great image.

It is an image with a lot of dust.

You folks in the IOTD process cannot go blind to technical matters and just fall over like this for dust. It makes the whole process look terrible!

-Bill



I was hopeful that the process was being more critically applied, and IMO there have been big improvements in the images over the last few weeks from both an image quality and a diversity standpoint. A lot of things in this image could be fixed in processing... like the severe coma, defocus from field curvature, star color, etc.

I've learned a lot about how IOTD is handled over the last few weeks. If I understand it correctly, there does not need to be any consensus among judges for the final pick, right? It's just one judge who gets to decide? Maybe that is a part of the process that could be improved upon? Would requiring two or three judges to agree on the IOTD pick help this process? Do some images slip through because they are looked at on cell phones and not computer screens?

On my phone, this image looks pretty good: nice huge field, lots of contrast to pull the dust out. Once I view it on my computer, it really falls apart. Congrats to the imager; I'm sure he is stoked to be recognized and is encouraged. It would be interesting to have the judge who selected this for IOTD give feedback, though. And please don't take this post as attacking the judges or the imager... I'm genuinely trying to understand things here and see AstroBin and IOTD improve! I hope the conversation can continue. Take a close look at this image...
 