Causes of / statistics on fab tool processing time uncertainty?

jms_embedded

Active member
Throughput vs. cycle time curves have a characteristic shape: cycle time can be driven down toward its minimum if the fab is very lightly loaded, but then throughput is low (which no one wants), and cycle time goes through the roof if the fab is very heavily loaded (which no one wants either). How sharply the curve bends, however, depends on the variability of the individual process steps.
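
For reference, the shape of these curves follows from queueing theory: Kingman's G/G/1 approximation (the "VUT" equation popularized by Factory Physics) makes the dependence on step-level variability explicit. A minimal sketch, with purely illustrative numbers:

```python
# Sketch of the fab operating curve via Kingman's G/G/1 approximation.
# All parameter values below are made up for illustration.

def queue_time(u, ca2, ce2, te):
    """Approximate time a lot waits in queue at a single tool.

    u   : utilization (0 < u < 1)
    ca2 : squared coefficient of variation (SCV) of lot arrivals
    ce2 : SCV of the tool's effective process time
    te  : mean effective process time (hours)
    """
    return ((ca2 + ce2) / 2) * (u / (1 - u)) * te

te = 1.0  # 1-hour mean effective process time, purely illustrative
for ce2 in (0.25, 1.0, 4.0):  # low, moderate, high process-time variability
    print(f"\nprocess-time SCV = {ce2}")
    for u in (0.5, 0.7, 0.85, 0.95):
        ct = te + queue_time(u, ca2=1.0, ce2=ce2, te=te)
        print(f"  utilization {u:.0%}: cycle time ~ {ct:.1f} h "
              f"({ct / te:.1f}x raw process time)")
```

The u/(1-u) term is why cycle time explodes as loading approaches capacity, and the (ca2+ce2)/2 term is why reducing step-level variability flattens the whole curve.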

Some of this variability is deliberate (e.g. scheduled equipment downtime, or unequal processing times due to a diverse product mix), and I assume scheduling algorithms can help minimize its impact on overall cycle time.

What about the uncertainty? Are there well-documented causes of, or statistics on, variation in processing times that cannot be anticipated?
Aside from unplanned maintenance (SEMI E10 defines unscheduled downtime, but unsurprisingly there don't seem to be any published statistics), what causes a particular tool to vary in the time it takes to complete a step? I would imagine most steps (lithography, etch, etc.) have a predetermined time to complete.
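
One concrete mechanism worth noting: even if the raw recipe time is perfectly deterministic, random failures that interrupt a lot in process make the effective process time a random variable. A toy preempt-resume Monte Carlo, with all parameters invented for illustration:

```python
import random

# Toy model: a lot needs a fixed T0 hours of processing, but the tool can
# fail mid-run (exponential time to failure) and must be repaired
# (exponential repair time) before the lot resumes where it left off.

random.seed(42)

T0 = 1.0       # deterministic raw process time per lot, hours
MTBF = 40.0    # mean time between failures, hours (illustrative)
MTTR = 4.0     # mean time to repair, hours (illustrative)

def effective_time():
    """Process one lot; add repair time for each failure that hits it."""
    remaining = T0
    t = 0.0
    while True:
        ttf = random.expovariate(1.0 / MTBF)  # time until next failure
        if ttf >= remaining:
            return t + remaining              # lot finishes before a failure
        t += ttf + random.expovariate(1.0 / MTTR)  # fail, then repair
        remaining -= ttf                      # resume the interrupted run

times = [effective_time() for _ in range(100_000)]
mean = sum(times) / len(times)
var = sum((x - mean) ** 2 for x in times) / len(times)
print(f"mean effective time: {mean:.2f} h (raw time {T0} h)")
print(f"SCV (variance/mean^2): {var / mean**2:.2f}  -- 0 would be deterministic")
```

With these made-up numbers the mean effective time is only about 10% above the raw time (availability is 40/44), yet the SCV is well above zero, and it is exactly that variability term that feeds the queueing approximation above.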

Are there any statistics published somewhere?
 