Hello,
We're using the software for MTF measurement on an SFRplus chart provided by IMATEST. We need to be able to discriminate between horizontal and vertical edges, but we are having a hard time understanding how to do it from the output data, and whether it is even possible. Basically, we're looking at the "slanted edge orientation" angle and the "edge orientation relative to radial line to centre" angle, as they are called in the output file "edge_sfr_values.txt". About the first one: as far as we understand, it is given after a modulo-45 operation, without specifying whether its computation is done with respect to the vertical or horizontal axis, so it turns out not to be useful for our purpose. About the second one: it is not stated how the normal to the edge is computed, and therefore which angle we are being given. On top of this, the angle convention seems to change from one quadrant to another, making the angle computation process very difficult to understand.
Can you help us by indicating how, and in which part of the code, these angles are calculated, or whether it is possible to extract some other data from the code that we could use to easily identify almost-horizontal and almost-vertical edges?
It would be much appreciated.
Regards,
Lorenzo
P.S. We are aware that your code is able to discriminate between sagittal and meridional edges, but we could not understand how that is done either. It could be added to the discussion since, honestly, it could possibly be even more useful than horizontal/vertical discrimination.
We're providing here one of our test images on which we're trying to compute the MTF. To explain the problem: we found that our optical system has different astigmatism in the two directions. We therefore use a model which provides sagittal and meridional OGSE MTF, which should be applied consistently to the edge MTF measurements. This is the core issue.
thanks again in advance.
Lorenzo.
Hi Lorenzo,
It is a bit clunky, but the quickest way to obtain the geometric edge orientation is to compute the vector between the "edge centroid" and the "nearby corner" coordinates listed in edge_mtf_values.txt (assuming you ran MTF Mapper with the -q -v 2 options).
The non-comment lines in edge_mtf_values.txt and edge_sfr_values.txt match one-to-one, so you can safely join the information across the two files. Then take x=$2 - $5 and y=$3 - $6, where $n represents column number n from edge_mtf_values.txt, to be the x and y components of your edge direction vector. I've plotted these vectors as a gnuplot vector field to illustrate (see attachment).
Another way is to parse the serialized_edges.bin file as shown in the read_edge_info() method of process.py, and use it like this:

import numpy as np
from process import read_edge_info  # from MTF Mapper's process.py

header, edges = read_edge_info("out/serialized_edges.bin")
for edge in edges:
    print(f"angle = {edge['angle']/np.pi*180:.1f}")

but do keep in mind that this binary file format is not public, i.e., I might still add new fields (but I will update process.py as needed to keep up).
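The first method can be sketched in a few lines of Python. This is only a sketch: the classify_edges name and the tol_deg tolerance are illustrative choices of mine, not part of MTF Mapper, but the column arithmetic (x = $2 - $5, y = $3 - $6 on the 1-based columns of edge_mtf_values.txt) follows the description above.

```python
import math

def classify_edges(fname="edge_mtf_values.txt", tol_deg=20.0):
    """Label each edge as near-horizontal, near-vertical, or oblique.

    Assumes the 1-based column layout described above: columns 2,3 hold
    the edge centroid (x, y), and columns 5,6 the nearby corner (x, y).
    """
    labels = []
    with open(fname) as f:
        for line in f:
            if line.startswith("#") or not line.strip():
                continue  # skip comment lines and blanks
            cols = line.split()
            dx = float(cols[1]) - float(cols[4])  # x = $2 - $5
            dy = float(cols[2]) - float(cols[5])  # y = $3 - $6
            # fold the direction into [0, 90] degrees: an edge and its
            # reverse describe the same orientation
            ang = abs(math.degrees(math.atan2(dy, dx))) % 180.0
            ang = min(ang, 180.0 - ang)
            if ang < tol_deg:
                label = "horizontal"
            elif ang > 90.0 - tol_deg:
                label = "vertical"
            else:
                label = "oblique"
            labels.append((ang, label))
    return labels
```

The fold into [0, 90] degrees matters because the centroid-to-corner vector can point either way along the same edge; only the unsigned orientation is needed to separate horizontal from vertical.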
Let me know if you still have questions!
Regards,
Frans
Hi Frans, we had completely missed the additional data on the edge corner in the edge_mtf file. It's straightforward this way.
thanks a lot,
Lorenzo.
Hi Frans,
We found ourselves coming back to this issue once again. We are now using rectangular and square slits in front of a light source to generate strong edges (four edges) in the acquisition for MTF computation. Please find in the attached images two screenshots of typical acquisitions for MTF measurement using these two targets.
With the rectangular slit, horizontal and vertical edges can be discriminated by their length, but this is no longer possible with the square target. For this reason, we tried to employ the same method you mentioned in your first reply, based on the nearby corner, but we encountered a strange issue: in every edge_mtf_values.txt file generated by an analysis, the nearby corner coordinates are set to zero. The only explanations that come to mind are related to the nature of the targets, but we could not find any clear justification.
This led us to employ the second method you referred to, using the serialized_edge.bin file and the read_edge_info function. Although this worked, and we were able to find an edge inclination angle for each edge contained in the image, we are not sure how these angles should be interpreted. For example, for the square target, four edges are successfully found and their centroids correctly identified, but the associated angles are as reported in the attached images. I numbered each edge to make it easier to compare the target image with the centroid/angle output.
The main point is: we could not determine the reference axis with respect to which these angles are computed. They seem to correspond to the angular distance from a horizontal axis, considered positive if computed clockwise and negative if counter-clockwise, but even then we cannot fully understand the rationale behind assigning one direction or the other for a given edge. Is there a fixed logic for associating an angle value to an almost vertical or horizontal edge orientation in the image?
Thank you again for your support,
Lorenzo