On 2/17/12 5:58 PM, Stan Hoeppner wrote:
> On 2/16/2012 8:49 PM, Dave Chinner wrote:
>> On Thu, Feb 16, 2012 at 12:50:57PM +0100, Assarsson, Emil wrote:
>>> Hi,
>>>
>>> Are there any recommendations about how much memory I need based
>>> on the size of the file system and/or amount of files? For
>>> example: how much memory would be optimal for a 20TB file system
>>> with 3000000 files?
>>
>> /me shrugs
>>
>>> I guess it depends on the usage pattern?
>>
>> Totally.
>
> Allow me to cast the OP's question in a different light...
>
> I have a 20TB XFS filesystem with 3000000 files. What is the minimum
> amount of system RAM I will need to run an xfs_check or xfs_repair or
> xfs_[tool] on this filesystem, assuming Linux has been put into a low
> overhead state, and said tool[s] has access to the bulk of the system
> memory?
>
> Is there a formula available so any XFS user can calculate this
> xfs_[tools] RAM requirement, given FS size X and file count Y?
>
http://xfs.org/index.php/XFS_FAQ#Q:_Which_factors_influence_the_memory_usage_of_xfs_repair.3F
-Eric
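
For what it's worth, one empirical shortcut (mentioned in that FAQ entry, if
I recall correctly) is to do a dry run of xfs_repair with a deliberately tiny
memory cap; the tool then reports roughly how much RAM it would want, rather
than you guessing from filesystem size and file count. Below is a minimal
Python sketch of that dry run. The device path /dev/sdb1 is only an example,
and while -n (no-modify) and -m (maximum memory in MB) are real xfs_repair
options, the exact wording of the reported estimate varies between xfsprogs
versions, so treat this as an illustration rather than a guaranteed recipe.

#!/usr/bin/env python3
# Dry-run xfs_repair with a ~1 MB memory cap so it reports the memory it
# would actually need instead of attempting a repair.
# NOTE: the device path below is an example; the filesystem must be
# unmounted, and the output format depends on the xfsprogs version.
import subprocess
import sys

def estimate_repair_memory(device: str) -> None:
    """Run 'xfs_repair -n -m 1 <device>' and show its output verbatim."""
    cmd = ["xfs_repair", "-n", "-m", "1", device]
    result = subprocess.run(cmd, capture_output=True, text=True)
    # Print everything; the memory estimate appears in the tool's own output.
    sys.stdout.write(result.stdout)
    sys.stderr.write(result.stderr)

if __name__ == "__main__":
    # Example invocation: ./estimate_repair_mem.py /dev/sdb1
    estimate_repair_memory(sys.argv[1] if len(sys.argv) > 1 else "/dev/sdb1")

The filesystem has to be unmounted first, even for a no-modify run.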