File: /opt/saltstack/salt/lib/python3.10/site-packages/salt/fileserver/__pycache__/hgfs.cpython-310.pyc
Mercurial Fileserver Backend

To enable, add ``hgfs`` to the :conf_master:`fileserver_backend` option in the
Master config file.

.. code-block:: yaml

    fileserver_backend:
      - hgfs

.. note::
    ``hg`` also works here. Prior to the 2018.3.0 release, *only* ``hg`` would
    work.

After enabling this backend, branches, bookmarks, and tags in a remote
mercurial repository are exposed to salt as different environments. The
remote repositories themselves are configured with the
:conf_master:`hgfs_remotes` option in the salt master config file.

This fileserver has an additional option :conf_master:`hgfs_branch_method` that
will set the desired branch method. Possible values are: ``branches``,
``bookmarks``, or ``mixed``. If using ``branches`` or ``mixed``, the
``default`` branch will be mapped to ``base``.
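
For example, a master configured to expose bookmarks as environments might
look like this (the repository URL is illustrative):

.. code-block:: yaml

    fileserver_backend:
      - hgfs

    hgfs_remotes:
      - https://hg.example.com/project

    hgfs_branch_method: bookmarks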


.. versionchanged:: 2014.1.0
    The :conf_master:`hgfs_base` master config parameter was added, allowing
    for a branch other than ``default`` to be used for the ``base``
    environment, and allowing for a ``base`` environment to be specified when
    using an :conf_master:`hgfs_branch_method` of ``bookmarks``.
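
A minimal sketch of the options this change enables, assuming an
illustrative bookmark named ``stable``:

.. code-block:: yaml

    hgfs_branch_method: bookmarks
    hgfs_base: stable

Here the ``stable`` bookmark is served as the ``base`` environment instead
of ``default``.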


:depends:   - mercurial
            - python bindings for mercurial (``python-hglib``)
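
The ``python-hglib`` dependency is optional at import time; the compiled
module carries a ``HAS_HG`` flag set by an import guard, and ``__virtual__()``
refuses to load the backend when it is ``False``. A minimal sketch of that
guard:

```python
# Minimal sketch of the hgfs import guard: hglib (python-hglib) is
# optional at import time; the backend only records whether it is
# available, and its __virtual__() check rejects loading later if not.
try:
    import hglib  # Mercurial command-server bindings (python-hglib)
    HAS_HG = True
except ImportError:
    HAS_HG = False

print(HAS_HG)
```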
Module-level constants embedded in the bytecode:

- ``__virtualname__ = "hgfs"`` and ``__virtual_aliases__ = ("hg",)``
- ``VALID_BRANCH_METHODS = ("branches", "bookmarks", "mixed")``
- ``PER_REMOTE_OVERRIDES = ("base", "branch_method", "mountpoint", "root")``
- ``HAS_HG``: ``True`` when ``hglib`` imports successfully, ``False`` otherwise

Functions defined by the module, with their embedded docstrings:

- ``__virtual__``: Only load if mercurial is available
- ``_all_branches``: Returns all branches for the specified repo
- ``_get_branch``: Find the requested branch in the specified repo
- ``_all_bookmarks``: Returns all bookmarks for the specified repo
- ``_get_bookmark``: Find the requested bookmark in the specified repo
- ``_all_tags``: Returns all tags for the specified repo
- ``_get_tag``: Find the requested tag in the specified repo
- ``_get_ref``: Return ref tuple if ref is in the repo
- ``_get_manifest``: Get manifest for ref
- ``_failhard``: Fatal fileserver configuration issue, raise an exception
- ``init``: Return a list of hglib objects for the various hgfs remotes
- ``_clear_old_remotes``: Remove cache directories for remotes no longer configured
- ``clear_cache``: Completely clear hgfs cache
- ``clear_lock``: Clear ``update.lk``; ``remote`` can be either a dictionary containing repo configuration information, or a pattern matched against remote URLs
- ``lock``: Place an ``update.lk``; ``remote`` is interpreted as for ``clear_lock``
- ``update``: Execute an hg pull on all of the repos
- ``_env_is_exposed``: Check if an environment is exposed by comparing it against a whitelist and blacklist (``hgfs_saltenv_whitelist`` / ``hgfs_saltenv_blacklist``)
- ``envs``: Return a list of refs that can be used as environments
- ``find_file``: Find the first file to match the path and ref, read the file out of hg and send the path to the newly cached file
- ``serve_file``: Return a chunk from a file based on the data received
- ``file_hash``: Return a file hash; the hash type is set in the master config file
- ``_file_lists``: Return a dict containing the file lists for files and dirs
- ``file_list``: Return a list of all files on the file server in a specified environment
- ``_get_file_list``: Get a list of all files on the file server in a specified environment
- ``file_list_emptydirs``: Return a list of all empty directories on the master
- ``dir_list``: Return a list of all directories on the master
- ``_get_dir_list``: Get a list of all directories on the master
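
The branch-to-environment rule described in the docstring (the ``default``
branch, or the branch named by :conf_master:`hgfs_base`, is exposed as the
``base`` environment) can be sketched independently of hglib; ``ref_to_env``
is an illustrative name here, not one of the module's actual helpers:

```python
def ref_to_env(ref_name, base="default"):
    """Map a Mercurial branch/bookmark name to a Salt environment name.

    Mirrors the hgfs rule that the branch named by ``hgfs_base``
    (``default`` unless overridden) is exposed as ``base``; every
    other ref name becomes an environment of the same name.
    """
    return "base" if ref_name == base else ref_name

print(ref_to_env("default"))                # -> base
print(ref_to_env("develop"))                # -> develop
print(ref_to_env("stable", base="stable"))  # -> base
```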