[Rtk-users] Implementing a Motion-Compensated SART (MCSART) Reconstruction
Simon Rit
simon.rit at creatis.insa-lyon.fr
Thu Apr 24 14:59:15 CEST 2025
Thanks for sharing, good to hear. We'll try to implement these forward and
backprojectors; that should not be hard.
If I/O is really the issue, I think there is a way to use the ITK Python
packages in MATLAB. I can dig in my archives if it's useful.
Simon
On Wed, Apr 23, 2025 at 11:08 PM Andosca, Ryan <RAndosca at mednet.ucla.edu>
wrote:
> Hello,
>
> That worked! I implemented a version of the offset detector filter based
> on the paper and that eliminated the artifact. So I now have a working
> version of MCSART, which is exciting! Big thanks for that!
>
> My next step is to try to speed it up - right now, a two-iteration MCSART
> using my MATLAB implementation takes about 8 hours, which is pretty steep
> for what I need to use it for. I currently have a lot of time wasted on I/O
> stuff because I'm using the command line applications to perform operations
> one projection at a time, so my thought to speed it up is to at least use
> the Python bindings to eliminate that aspect of it. I think the ideal
> solution would be to write my own filter that implements deformation before
> forward projection and after back projection and otherwise is identical to
> the SART filter, but my talents in C are nonexistent as I've mentioned, so
> that may be beyond me.
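> A single in-memory forward projection with the itk/RTK Python package might
> look roughly like the sketch below (geometry values, detector size and file
> names are placeholders, not taken from my actual data):
>
> import itk
> from itk import RTK as rtk
>
> ImageType = itk.Image[itk.F, 3]
>
> # Single-angle geometry (placeholder SID/SDD/angle); a full geometry XML file
> # could instead be read once and kept in memory.
> geometry = rtk.ThreeDCircularProjectionGeometry.New()
> geometry.AddProjection(1000.0, 1536.0, 30.0)
>
> # Current volume estimate, kept in memory rather than re-read from disk.
> volume = itk.imread("current_estimate.mha", itk.F)
>
> # Empty one-projection stack matching the detector grid (placeholder size/spacing).
> proj_source = rtk.ConstantImageSource[ImageType].New()
> proj_source.SetSize([512, 512, 1])
> proj_source.SetSpacing([1.0, 1.0, 1.0])
> proj_source.SetOrigin([-255.5, -255.5, 0.0])
> proj_source.SetConstant(0.0)
>
> # Joseph forward projection of the (already deformed) volume, with no file I/O.
> fwd = rtk.JosephForwardProjectionImageFilter[ImageType, ImageType].New()
> fwd.SetInput(0, proj_source.GetOutput())
> fwd.SetInput(1, volume)
> fwd.SetGeometry(geometry)
> fwd.Update()
> projection = itk.array_view_from_image(fwd.GetOutput())  # NumPy view, e.g. to hand back to MATLAB
>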
>
> In any case I appreciate all of your help, and I'll keep an eye on the
> issue that you linked in your first message!
>
> Best,
> Ryan
> ------------------------------
> *From:* Simon Rit <simon.rit at creatis.insa-lyon.fr>
> *Sent:* Friday, April 18, 2025 12:10 AM
> *To:* Andosca, Ryan <RAndosca at mednet.ucla.edu>
> *Cc:* rtk-users at openrtk.org <rtk-users at openrtk.org>
> *Subject:* Re: [Rtk-users] Implementing a Motion-Compensated SART
> (MCSART) Reconstruction
>
> Hi,
> Yes, everything you said in your first email is correct. But you can
> replace "forward projection of ones (using your Joseph forward
> projector)" by the rtk::RayBoxIntersectionImageFilter as implemented in
> RTK to make things faster.
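> A minimal sketch of that normalization with the Python package, assuming a
> zero-filled projection-sized image (e.g. from rtk.ConstantImageSource), the
> geometry and the current volume estimate are already in memory; whether
> SetBoxFromImage is exposed in the Python wrapping is an assumption here:
>
> import itk
> from itk import RTK as rtk
>
> ImageType = itk.Image[itk.F, 3]
>
> # Intersection length of each ray with the volume's bounding box, i.e. the
> # same values as a Joseph forward projection of a volume of ones.
> raybox = rtk.RayBoxIntersectionImageFilter[ImageType, ImageType].New()
> raybox.SetInput(zero_projections)  # the lengths are added to the input, so start from zeros
> raybox.SetGeometry(geometry)
> raybox.SetBoxFromImage(volume)     # assumption; otherwise set SetBoxMin() / SetBoxMax() from the volume corners
> raybox.Update()
> ray_lengths = raybox.GetOutput()   # divide the projection difference by this before backprojecting
>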
> The additional blurring in your images is most likely due to the
> intermediate interpolation steps for warping.
> Are your projection images truncated? If yes, you may want to add the
> offset detector image filter as in the application to avoid the artifact
> you mention:
>
> https://github.com/RTKConsortium/RTK/blob/master/include/rtkSARTConeBeamReconstructionFilter.hxx#L44
> This has been described in this paper:
> https://doi.org/10.1088/0031-9155/58/2/205
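> For the record, a rough sketch of adding that weighting with the Python
> package; the exact filter (presumably rtk::DisplacedDetectorImageFilter) and
> its Python wrapping are assumptions in this sketch, so check the Doxygen:
>
> from itk import RTK as rtk
>
> # Weight the (normalized) projection difference for the offset detector
> # before backprojection, as done in the SART application.
> displaced = rtk.DisplacedDetectorImageFilter.New(normalized_difference)  # hypothetical input name
> displaced.SetGeometry(geometry)
> displaced.Update()
> weighted_difference = displaced.GetOutput()  # backproject this instead of the unweighted difference
>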
> Simon
>
> On Thu, Apr 17, 2025 at 10:48 PM Andosca, Ryan <RAndosca at mednet.ucla.edu>
> wrote:
>
> As an update, I swapped to the voxel-based back projection (from
> CudaRaycast) and normalized in just one step by dividing the forward
> projection by a forward projection of ones. The image quality did improve a
> little, but the main artifact of concern is still present. To describe the
> artifact better, it's a bright cylinder near the center of the patient. I'm
> now wondering if this may be a geometry error of some kind rather than
> normalization... No such artifact exists when I use the rtksart
> application, which uses the same geometry file I use in my MCSART MATLAB
> implementation. But there are other areas where geometry artifacts might
> originate, like improperly updating file headers etc...
>
> Since my coding is very clunky, my workflow involves a lot of saving out
> variables, sending them to RTK applications, then loading the results back
> into MATLAB for deformation. Not exactly streamlined, I know. But perhaps
> some header information isn't being sent to the RTK applications that
> should be at some point, or the header information is off slightly
> somewhere. I'll attach an image of the artifact in question to this email,
> though I'm not sure it will appear on the message board.
>
> In the images, left is the rtksart output and right is my MCSART result
> (after one iteration). Also notice that the SART image appears sharper than
> MCSART - I can't be sure if that's a function of noise or what, but that
> difference may also speak to the issue for all I know... That diaphragm
> sure gets sharper though, so motion compensation is at least doing what it
> should be doing.
>
>
>
>
> ------------------------------
> *From:* Andosca, Ryan <RAndosca at mednet.ucla.edu>
> *Sent:* Thursday, April 17, 2025 8:55 AM
> *To:* Simon Rit <simon.rit at creatis.insa-lyon.fr>
> *Cc:* rtk-users at openrtk.org <rtk-users at openrtk.org>
> *Subject:* Re: [Rtk-users] Implementing a Motion-Compensated SART
> (MCSART) Reconstruction
>
> Hi Simon,
>
> Thank you so much for your reply! If I understand correctly, I just need
> to divide the forward projection step by a forward projection of ones
> (using your Joseph forward projector, which I do), and if I use voxel-wise
> back projection I needn't apply a secondary normalization to the back
> projection step? Is there a downside, maybe computationally, to this
> implementation? And if I do use Joseph back projection, do I normalize by
> dividing by a Joseph back projection of ones through the current
> projection's geometry? Apologies for the flurry of questions, I'm quite new
> to this and doing my best to learn as I go!
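> To spell out my reading in pseudocode (a sketch only, where fp() and bp()
> stand for single-angle Joseph forward projection and voxel-based back
> projection, hypothetical helpers rather than actual RTK calls):
>
> ray_lengths = fp(ones_volume, angle)          # or the ray-box intersection lengths
> residual = (measured[angle] - fp(f, angle)) / ray_lengths
> f = f + lam * bp(residual, angle)             # voxel-based BP, no extra normalization applied here
> f[f < 0] = 0                                  # enforce positivity
>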
>
> If you are able to implement a command line application that utilizes the
> warp projectors, that would be simply amazing for me! My Frankenstein
> implementation is a bit of a mess, after all.
>
> I greatly appreciate all of your help!
>
> Best,
> Ryan
> ------------------------------
> *From:* Simon Rit <simon.rit at creatis.insa-lyon.fr>
> *Sent:* Thursday, April 17, 2025 7:48 AM
> *To:* Andosca, Ryan <RAndosca at mednet.ucla.edu>
> *Cc:* rtk-users at openrtk.org <rtk-users at openrtk.org>
> *Subject:* Re: [Rtk-users] Implementing a Motion-Compensated SART
> (MCSART) Reconstruction
>
>
> Hi Ryan,
> If you check our code, the normalization step is quite simplistic to avoid
> a backprojection: we simply calculate and divide by the length of the
> intersection between each ray and the "image box" (parallelepiped
> corresponding to the space occupied by the volume) using the
> rtk::RayBoxIntersectionImageFilter.
> This is exactly what you should get if you forward project a volume of 1
> with our Joseph forward projectors. See the diagram from the implementation
> of SART:
>
> https://www.openrtk.org/Doxygen/classrtk_1_1SARTConeBeamReconstructionFilter.html
> Maybe you can try to do the same here?
> For what you want to do, it would be better to use a forward and
> backprojector combining the projection and a vector field... We have it
> available for the backprojection
> (https://www.openrtk.org/Doxygen/classrtk_1_1FDKWarpBackProjectionImageFilter.html) and
> for the forward
> (https://www.openrtk.org/Doxygen/classrtk_1_1ForwardWarpImageFilter.html),
> maybe we should try to implement it in the SART command line application
> for you... I have opened an issue to track this:
> https://github.com/RTKConsortium/RTK/issues/721
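> Until then, the warp can be applied as a separate step before the Joseph
> forward projection (and with the inverse DVF after the voxel-based
> backprojection), which is essentially what you are doing in MATLAB. A minimal
> sketch of that step with itk.WarpImageFilter, assuming the DVF is defined on
> the volume grid and already loaded:
>
> import itk
>
> ImageType = itk.Image[itk.F, 3]
> DVFType = itk.Image[itk.Vector[itk.F, 3], 3]
>
> # Warp the current volume estimate to the projection's breathing phase; the
> # warped volume is then fed to the Joseph forward projector.
> warp = itk.WarpImageFilter[ImageType, ImageType, DVFType].New()
> warp.SetInput(volume)
> warp.SetDisplacementField(dvf)             # DVF for the current projection
> warp.SetOutputParametersFromImage(volume)  # or set output origin/spacing/direction/size explicitly
> warp.Update()
> warped_volume = warp.GetOutput()
>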
> Cheers,
> Simon
>
>
> On Mon, Apr 14, 2025 at 6:25 PM Andosca, Ryan <RAndosca at mednet.ucla.edu>
> wrote:
>
> Hi,
>
> So, my experience with coding, especially in C, is extremely limited. I
> mostly work in MATLAB, and as such I've been primarily using the built-in
> console applications to perform reconstruction on CBCT datasets. The
> rtksart application works great and provides a beautiful reconstruction. My
> goal is to effectively do the same reconstruction, but tack on motion
> compensation to each iteration.
>
> After delving into the algorithm implementation in rtksart as much as I am
> able, I wrote a MATLAB workflow that boils down to the following (a rough
> pseudocode sketch follows the list):
>
> - Start with image of zeros
> - Randomize gantry angle order
> - Iterate over gantry angles:
>    - Load current projection forward and backward motion compensation DVFs
>    - Deform current image iteration to current gantry angle breathing state
>    - Forward project deformed image
>    - Subtract deformed projections from raw CBCT image data
>    - Back project the subtraction image
>    - NORMALIZE BACK PROJECTION (this is where I think my issue is)
>    - Deform normalized back projection back to the original image space
>    - Update current image: img = img + lambda *
>      (normalized_back_projection / number_of_gantry_angles)
>       - Note: This step, as it also contains normalization, may also be a
>         source of my issue...
>    - Set negatives to 0
> - Repeat
>
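> Python-style pseudocode of the workflow above (a sketch only: warp,
> forward_project, back_project and load_dvfs are hypothetical helpers standing
> in for the RTK/ITK calls, and norm_volume is the normalization image built as
> described further below):
>
> import numpy as np
>
> f = np.zeros(volume_shape, dtype=np.float32)         # start with an image of zeros
> for iteration in range(n_iterations):
>     for a in np.random.permutation(n_angles):        # randomized gantry angle order
>         dvf_fwd, dvf_bwd = load_dvfs(a)              # forward/backward DVFs for this projection
>         f_def = warp(f, dvf_fwd)                     # deform to this angle's breathing state
>         residual = projections[a] - forward_project(f_def, a)
>         bp = back_project(residual, a)               # back project the subtraction image
>         bp = bp / norm_volume                        # NORMALIZE BACK PROJECTION (the suspect step)
>         bp = warp(bp, dvf_bwd)                       # deform back to the original image space
>         f = f + lam * bp / n_angles                  # update with relaxation lambda
>         f[f < 0] = 0                                 # set negatives to 0
>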
>
> I get an image that is much as I expect - a much sharper diaphragm and
> generally higher definition in soft tissue structures. So I must be
> generally on the right track. The issue is that the normalization is
> clearly off in some way. I have a bright circle artifact in the middle of
> the image and some striping throughout. Try as I might to understand it, I
> couldn't discern how normalization was done in the rtksart application. So,
> this is the method I used:
>
> - Create an image of 1s of the size of the final (SART reconstructed)
> image.
> - Forward project through this image of 1s
> - Back project through the resulting forward projection
> - In the normalization step in the MCSART process detailed above, I
> simply divide by this resulting back projection.
>
>
> Knowing all of that, any idea what my issue might be that is leading to my
> artifacts? Ultimately, I know it would be more ideal if I just altered the
> rtksart application itself to include motion compensation, and then I'd
> just be using the built-in normalization that clearly works, but I find my
> abilities lacking when I attempt to do this... So instead I am just seeking
> to understand how that application implements normalization, so that I can
> use a similar normalization in my algorithm!
>
> I greatly appreciate any help provided!
>
> Best,
> R. Andosca
> GSR
> UCLA Health
>
>
>
>
>
[Two PNG image attachments, archived at
http://www.creatis.insa-lyon.fr/pipermail/rtk-users/attachments/20250424/01389b74/attachment-0002.png
and
http://www.creatis.insa-lyon.fr/pipermail/rtk-users/attachments/20250424/01389b74/attachment-0003.png]