# 19 Feb, 2019 23:19
I am a recent judge and I have a question to ask you all. Perhaps we can do a poll to answer it. Currently, although it's not a rule, it's common practice among the judges to skip over a target if it has been selected recently. For instance, if the Rosette was IOTD yesterday, judges will not select another Rosette even though it may be the best image in the queue of Top Picks. This forces us to select the next best one even though the Rosette was the best. The current consensus is that if a second image of the same target is selected, it should be exceptional. The explanation is that AstroBin members get upset when the same subject is selected twice. The problem I see is that when a target is in season, there will be many examples of it. So if one member quickly puts one up and gets an IOTD, the rest are not likely to get one. The same applies to an individual: should the same person get 2 IOTDs in a row, or a week apart? What if a member uploads 3 jaw-dropping images on the same day? Are we to ignore them because they already got an IOTD?
I guess my question really is: are we picking the best image available to us, or are we picking the image that makes members happy?
# 20 Feb, 2019 00:39
It's not difficult to solve this problem. The IOTD should be selected from the material of the day, rather than from photos released a few days earlier. Thanks.
# 20 Feb, 2019 01:31
Great questions, and ones I have considered myself since I started enjoying AstroBin.
After going around on these questions a few times, I have landed in favor of what I think is the current policy. I don't mind having the same object as the target of a second IOTD soon after the first, but it should clearly be a step above any of the other images being considered for that day. I also don't mind an individual receiving multiple IOTDs in a short timespan, but I think that naturally, with each additional award for that same individual, the bar gets raised a bit higher for the next time.
Your last question is the most interesting. I don't agree with either definition you have offered. Some time ago, Michele Vonci posted this statement on the forum:
"I believe astrophotography is about the challenge of getting the best image with the actual conditions of the photographer (gear/location/light pollution etc.), not the best photography in absolute. For the best pictures we have already the Hubble telescope."
I would add that astrophotography can also be about capturing an object which hasn't been seen much before, or a unique perspective of a more common object. The IOTD should encompass all of this. Since the process was reworked last year, I have no complaints about the IOTD system. I enjoy seeing the IOTD image each day, especially if I recognize the imager and I know how much work that person put into the final product.
# 20 Feb, 2019 03:57
^ +1 above - well said, Gary!
Having enjoyed judging IOTD for the past year, I'm in agreement with Gary & Tolga's thoughts.
We were crucified by the members last year when we selected the same subject twice in the same week - and I have since tried very hard to ensure that the same target didn't appear again in the same week. Occasionally, this has meant asking a fellow judge to reconsider their selection for the 'greater good'.
Some images, however, are selected without careful attention to detail, and again, the judge has been asked to reconsider (I too was guilty of that once).
It's very important, IMO, that the new judging team remain in contact and communicate with one another. Sometimes when an IOTD has been selected and questioned by another judge, the original judge goes offline for an indefinite period and the IOTD stands as is, warts and all, since only the original judge can change it once it's selected.
Although individual images are selected in isolation, remember that IOTD judges are effectively working as a team, and AstroBin is stronger for it.
# 20 Feb, 2019 10:52
That's the point.
Everyone can make a mistake or overlook something.
But as a team, the others should point it out so the error can be corrected in time.
I suppose we are all old enough to endure criticism of a selection, and to correct it as far as the criticism is justified.
# 20 Feb, 2019 15:53
I remember that it was not only the same subject but also the same author! I think it was the combination of both factors which upset the members…
By the way, I am not sure that members were really upset; rather, they needed to be reassured that it was indeed an error, and not something meant to happen frequently…
But mistakes happen… that's it. Not a big deal from my point of view. There has been very little controversy so far… the judges did a great job!
# 21 Feb, 2019 19:31
Everything that Gary said. I'd rather my own image be skipped over if that target had just been selected. There are countless targets in the sky, and some have a built-in wow factor or are easier to photograph (Tolga's example of the Rosette is a perfect candidate for both). These are aspects that could overshadow the technical accomplishments made on some lesser-known targets and very deserving images.
As for the IOTD queue and a judge 'going dark': could a system be implemented whereby another judge places a hold on an image if there are serious concerns?
# 21 Feb, 2019 21:43
I think we are in agreement for the most part. As a judge, I don't have access to Hubble images. I have 10 images (could be more; the number is not the point) to pick from that came from the submitters. My question is: why should the second image of a target have to be better than the first image of that target? I am not comparing the two. I am comparing the 10 that are in front of me. If I ignore an image because its target was selected 2 days ago, I am then in a position to select a perhaps not-so-good image.
I get this question a lot from people. They ask me why a particular target was selected: "Look at mine from a month ago, it's better." As judges, we are not comparing the current image to your image; we are comparing what is available to us in the queue. So my second question is: should I pick the best of the 10 that are available in the queue, or pick images based on other criteria?
It seems very unfair to me if you were a day late posting a target which should have won but got skipped over because someone else beat you to it. It could be even worse: person A's image could be sent up to the judges by submitters before person B's image, even though person B posted theirs first. We select the first image, and person B is now skipped over. These are not hypothetical scenarios; they do happen. This is why I am suggesting we pick the best of what is available without looking at other factors.
# 21 Feb, 2019 22:00
There's a lot in what you just said that I wasn't aware of before, Tolga. I actually didn't understand that it worked like that. Regardless: it seems like the underlying issue you're having is with how the queue works. Perhaps that's where changes could be made? I do agree that images shouldn't be passed over *solely* because of bad timing. But I still think that judges should seek to highlight a variety of targets in their IOTD selections.
# 22 Feb, 2019 04:38
As a new submitter, I try to look for the best image, even if that means I am selecting the same target 3 different times on the same day. To me, a good image is a good image, regardless of whether it has been accorded IOTD in the past week or not.
If the intention is really to select different targets on successive days, it might be worth publishing those "requirements", even if they are mere guidelines (unless they are posted somewhere, in which case it may be worth reiterating where to find them; I certainly have no clue whether they are published, and I have been using this site for a while now).
Next, I am also a bit surprised that there is emphasis on "own equipment". While I am not looking to argue against it, there are folks who are part of imaging teams, and I think it is a bit of a disservice to those teams not to consider their images (although I am sure I have seen IOTDs from folks who have been part of an imaging team, so I am not sure how stringent this criterion is). An option might be to have different categories. I think that idea has been floated before; maybe it is time to reconsider it.
# 22 Feb, 2019 08:12
First, I would like to thank you for bringing this topic to an open discussion and for sharing your thoughts with us.
I support Chris' point of view, but without strong emphasis. I am one of the "own equipment" guys, but, since we do not have categories here, I strongly support the idea that every source of data, as long as it is declared correctly, should be considered for IOTD.
However, there is something that makes me feel rather uneasy: sometimes I suspect that nobody (neither submitter, nor reviewer, nor judge) ever had a look at the full resolution of the IOTD. Here is a 100% crop out of a (not very recent) IOTD:
Just wondering …
# 22 Feb, 2019 10:00
I've seen this too and also wondered.
But shit happens.
That's why I think it is important that, if a judge selects an obviously inappropriate picture, the others raise an objection before it is too late.
# 22 Feb, 2019 10:16
Fritz, I do. Sometimes stars don't look pretty in widefield landscape panoramas. I cannot comment on the example you present since I can't see the entire image, and I don't have the rest of the images from the review queue. There are occasions when I can't find any interesting images in the queue and I choose not to vote at all.
# 22 Feb, 2019 10:28
Thank you for replying to my post. Please understand that I have no intention of pointing a finger at a particular image in a public discussion, so I would like to keep this random crop anonymous. I would rather not see folks pixel-picking my images in public.
# 22 Feb, 2019 10:47
I am with you regarding anonymity. That's the reason I am using this odd nickname.
In some image batches there is nothing interesting for me to pick. If others (and this is a guess, I don't know how other reviewers judge) don't pick anything, maybe the Judges are left with very poor images (sorry) to judge. And they have to pick something anyway. This is not their fault; it's just a period of time with no good pictures. Such periods exist and anyone can confirm their existence by browsing the Recent Images stream.
# 22 Feb, 2019 10:58
Do we have any specific standards to help all IOTD staff choose images?
# 22 Feb, 2019 14:22
Fritz raises an interesting question, one which I often debate in my head (not so much related to IOTD, but to posting in general):
How important is the full resolution view of an image? And related to this, how many Astrobin users take the time to look at an image in full resolution?
I primarily use my images as screen backgrounds or other applications where the viewer is usually only viewing the full image (not zoomed in). If the object fills the whole frame, then this issue is irrelevant. But if the object is small scale and only partially fills the FOV, I often crop the image, sometimes tightly, to highlight the details of the image. Of course, when that happens, a full resolution investigation of the cropped image will show more flaws on the stars and the background.
A good example is Fritz' recent wonderful IOTD of Thor's Helmet. It is a deserving image, and one which looks great at full resolution. But at normal resolution, Thor's Helmet only takes up about 1/6 of the width. So viewed at normal resolution, the object itself is not as striking and the finer details are harder to see.
# 22 Feb, 2019 19:21
Gary, as a new submitter, I will say it's easy to miss a small object in a wide field. Sometimes the wide field is an important part of the composition, sometimes it's best to crop it.
If I think the image might be worthy of submitting I look at full resolution every time. Our queue is "two-up" so there are several images on the screen at a time. I also notice that the colors seem more muted in the initial view and become more vivid on full screen.
# 22 Feb, 2019 20:39
Thanks, Kevin, that is good to know.
# 22 Feb, 2019 21:25
Check out these ideas from Andy, who has served as an IOTD judge: https://www.astrobin.com/forum/c/astrobin/generic-discussions/a-few-thoughts-incoming-iotd-staff-on-image-assessment/
I think it is well-reasoned, and covers a lot of what I am looking for as a submitter.
If you look back at the discussion forum over the past few years, there are many discussions on this topic. I think it is basically universally agreed that IOTD staff should look at the images they select at full resolution, as well as zoomed out, and also look at the 'technical card' and 'description' fields. This is very easy and pretty fast to do with the system that Salvatore set up for IOTD staff. What is NOT agreed upon is how much we should weigh 'degree of difficulty' vs. 'novelty' vs. 'processing skills' vs. 'pretty picture' vs. 'technically perfect', etc., when making choices as Submitters/Reviewers/Judges. Personally, I think that is fine, and since it is a human process, I would hope the AstroBin community would understand that not every Top Pick or IOTD needs to be universally agreed upon, and that very occasionally there may be "mistakes".
# 22 Feb, 2019 21:38
Well said, Nico!
As an engineer, I love to over-quantify things. My wife will tell you crazy stories associated with that. But I agree that, in this case, we should rely on the judgment and experience of the Submitters/Reviewers/Judges. Having specific weightings or standards could be a cumbersome process and a burden on those folks who are volunteering their time to make the IOTD process happen.