# 02 Sep, 2019 00:28
My 2 cents on a subject that I’ve seen popping up on an almost monthly basis. There is no doubt that my images don’t come close to those of any of the big players. I hope one day I’ll get there, but obviously there are many factors at play, like experience, equipment, time, and being smarter at it than I am.
What I think might be a good idea is a second-tier IOTD, where people with less experience or cheaper equipment can compete. There could also be multiple categories, like backyard imaging, remote observatory, sourced data, traveller, etc. I know that’ll take more work from you guys doing this in your spare time, but I think it would be a great addition to an otherwise excellent site.
Thank you, and keep up the good work!
# 02 Sep, 2019 02:29
Having been an IOTD judge last year, I thought I'd once again share my thoughts on image assessment.
This was just my approach and for anyone considering coming onboard the team, it may prove useful.
I have been judging professional photography awards for the past 25 years consecutively, and have had 6 of my images featured by NASA/APOD.
Firstly, an image needs to have impact! Have I seen that object before?
If so have I seen it treated/processed like this before?
Is it well composed or has the photographer merely stuck it in the middle of the frame?
Have they considered the object's context relative to its surroundings?
If NB, is the colour palette harmonious and pleasing (or in the case of LRGB, accurate)?
Assuming all these criteria have been handled favourably, I then look closely at the technical aspects, i.e. detail, sharpness, noise, treatment of stars (round stars, star colour & density), resolution, dynamic range, etc.
Finally, I consider degree of difficulty. E.g., if there were two very similar high-quality images of the same target that passed all the above assessment criteria, and one was from a backyard and the other from remote data, I'd likely lean in favour of the backyard image.
(apologies to remote imagers, I'm not starting a flame war here!)
That being said, I have previously awarded IOTDs to outstanding remote images, not because they are technically perfect (that's mandatory, in my opinion, for images from remote data), but because they are impactful, original & beautifully processed!
I have also overlooked some remote images that, whilst technically perfect, lacked impact, originality, composition & aesthetics.
There are many reasons an image is awarded IOTD, but these are the ones I generally considered.
Other judges with experience may take a different approach of course!
You might also find Nico Carver's "Guidelines for IOTD Staff" document useful… https://docs.google.com/document/d/1fZSRQv6Wu8AHVqT4Ymer560XaS4bFbaJVGmONiEToX4/edit?usp=sharing
Hope that helps!
# 04 Sep, 2019 03:20
I like to get likes and wish I got more likes (never mind an IOTD), but also realize that there are a bunch of reasons – both ones I control and ones I don't – why I don't. There are other things about AstroBin that are valuable enough to make it worthwhile for me: plate-solving, a place to organize and archive my capture data, a place to research new targets, a place where I can be inspired by others' work.
All that said, I think AstroBin could be a more socially welcoming site if it intentionally invested in helping photographers grow. In the 3-4 years I've been posting pictures here, I've gotten one piece of unsolicited constructive advice. Otherwise, I've either PM'd folks directly for advice or have just had to draw my own conclusions. Likes and views are nice, but what I'd really value is feedback on what I'm doing well and where I can improve (preferably with a concrete suggestion or two to try).
Thinking about it today, two ideas I had were "critics" and "mentors". Both are voluntary roles. A critic signs up to give anonymous feedback – 2-3 things the imager did well, and 2-3 things they might try as improvements – for (say) 5 images a week, for (say) a three month stint. Receiving a critic's feedback is voluntary as well. Mentors aren't anonymous, are matched with specific mentees, and volunteer to give more detailed feedback on the mentee's images, answer questions, etc., for (say) a three month stint. Either way, if you want feedback, there is a way for you to get it. And if you want to give feedback, you can without committing yourself to review hundreds or thousands of images, or for an excessively long time.
I realize implementing either proposal would be non-trivial. But I'd certainly sign up as a critic and to receive criticism. I think that the feedback could be enormously valuable and would ground AstroBin's social aspects on what I suspect is a broadly shared interest: the desire to improve and the desire to be recognized.
[edit:camelcase is hard]
# 04 Sep, 2019 05:33
I think that’s a constructive suggestion. I would personally like both to give and get advice without the fear of it being taken the wrong way.
# 04 Sep, 2019 06:06
Joel, that's a nice idea! I will keep it in mind for the future!
# 04 Sep, 2019 09:27
Thanks, Connor!
I hadn't read this thread before, so I didn't notice I was mentioned…
Well, VDB141 is a fantastic and very photogenic nebula; it's not really surprising that good pictures of it receive TP/IOTD more readily than images of many less photogenic objects…
But astrophotography is a "seasonal" activity; right now we are entering the "M31 era", and then it will be M42, etc…
As a reviewer, I have to confess that it can sometimes be tedious to see 3, 4, 5 or more images of the same object in the selection… In such cases, I tend to promote the best image of that object and keep my two other promotions to push images of other objects, in order to ensure a certain variety among TP images.
But judges, like reviewers, are obviously dependent on submitters' choices. And submitters' choices are not coordinated, so on some occasions this can lead to an overrepresentation of certain (popular) objects. It is then the role of reviewers and judges to correct these occasional tendencies…
From my point of view, I believe there is a "bonus" for the very first images of these seasonal popular objects: the first (good) images of an object can receive an IOTD/TP more easily than the later ones, and by the end of the season only exceptional images are rewarded. I think the explanation is quite simple: submitters, reviewers and judges grow tired of seeing many similar images for weeks, and meanwhile their standards for these objects become higher and higher.
That's probably what allowed me to receive an IOTD for my image of VDB141, although I do believe the image itself is objectively quite good.
Speaking of the "trend-correcting" role of reviewers and judges… sometimes it leads to excess (we reviewers have to be self-critical too): on some occasions, I noticed that images which were promoted by 5 or 6 submitters were then not promoted by any reviewer! That's understandable when an image is promoted by only one submitter, but hard to understand at that level (very few images are promoted by so many submitters). You could see it as a lack of consideration for submitters' choices; but on the other hand, it demonstrates, in my opinion, that reviewers' and judges' choices are not driven or influenced by the popularity of the photographers.