Effects of jasmine flower residue on the growth performance and serum biochemical, antioxidant, and immune indices of Guangxi black goats
HUANG Heng, DENG Fuchang, LU Junzhi, LI Yehong, LIANG Qiong, HUANG Huali, HU Junjie, JIANG Huimin, WANG Jinxing, XIAO Peng, YANG Xiaogan, LIANG Xingwei, ZENG Jun
Heilongjiang Animal Science and Veterinary Medicine (2023)
Guangxi University | Guangxi Guiken Nasuo Animal Husbandry Co., Ltd.
Abstract
To investigate the effects of jasmine flower residue on the growth performance and serum biochemical, antioxidant, and immune indices of Guangxi black goats, 28 healthy 4-month-old Guangxi black goats weighing approximately 24 kg were randomly divided into a control group and an experimental group of 14 goats each. The control group was fed a basal diet, and the experimental group was fed an experimental diet in which 10% jasmine flower residue replaced 6% corn and 4% soybean meal of the basal diet; the two diets had similar nutrient levels, each with a concentrate-to-roughage ratio of 43:57. The pre-trial period was 7 d and the formal trial period was 30 d. At the end of the trial, the two groups were compared in terms of growth performance (final body weight, total weight gain, average daily gain, average daily feed intake, and feed-to-gain ratio), economic benefit (weight-gain income and gross income), serum biochemical indices [levels of total protein (TP), albumin (ALB), and globulin (GLB); activities of alanine aminotransferase (ALT), aspartate aminotransferase (AST), and alkaline phosphatase (ALP); and concentrations of total bilirubin (TBIL), direct bilirubin (DBIL), urea nitrogen (BUN), glucose (GLU), triglyceride (TG), cholesterol (CHOL), calcium, magnesium, iron, and phosphorus ions, low-density lipoprotein cholesterol (LDL-C), and high-density lipoprotein cholesterol (HDL-C)], antioxidant indices [total antioxidant capacity (T-AOC); activities of total superoxide dismutase (T-SOD), catalase (CAT), and glutathione peroxidase (GSH-Px); and malondialdehyde (MDA) concentration], and immune indices [levels of immunoglobulin A (IgA), immunoglobulin M (IgM), immunoglobulin G (IgG), interleukin-1β (IL-1β), interleukin-2 (IL-2), interleukin-4 (IL-4), interleukin-6 (IL-6), interleukin-10 (IL-10), tumor necrosis factor-α (TNF-α), interferon-γ (IFN-γ), complement protein 3 (C3), complement protein 4 (C4), acid phosphatase (ACP), and lysozyme (LZM)]. The results showed no significant differences between the two groups in final body weight, total weight gain, average daily gain, average daily feed intake, or feed-to-gain ratio (P>0.05), but the weight-gain income and gross income of the experimental group were 11.81% and 18.08% higher, respectively, than those of the control group. Serum ALP activity, calcium ion concentration, T-AOC, T-SOD activity, and IgA, IgM, IgG, IL-4, and C4 levels were significantly higher in the experimental group than in the control group (P<0.05), while BUN and MDA concentrations were significantly lower (P<0.05); the remaining serum biochemical, antioxidant, and immune indices did not differ significantly between the groups (P>0.05). These results indicate that feeding Guangxi black goats a diet in which 10% jasmine flower residue replaces 6% corn and 4% soybean meal of the basal diet has little effect on growth performance but markedly improves economic benefit, antioxidant capacity, and immune function, with no negative effects on serum biochemical indices.
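The abstract reports significance at the P<0.05 threshold but does not name the statistical test used. As an illustration only, the following minimal Python sketch shows how such a two-group comparison might be run, assuming an independent-samples (Welch's) t-test; all numbers below are hypothetical placeholders, not data from the study.

# Minimal sketch of a two-group comparison at the P < 0.05 threshold.
# ASSUMPTIONS: the paper does not state which test was used; Welch's
# t-test is assumed here, and all values are hypothetical placeholders,
# NOT data from the study.
from scipy import stats

# Hypothetical serum ALP activities (U/L) for the 14 goats per group
control = [95, 102, 88, 110, 97, 105, 92, 99, 101, 94, 98, 103, 96, 100]
treated = [112, 118, 105, 125, 110, 121, 108, 115, 119, 111, 116, 120, 113, 117]

# equal_var=False selects Welch's t-test (no equal-variance assumption)
t_stat, p_value = stats.ttest_ind(control, treated, equal_var=False)
print(f"t = {t_stat:.3f}, P = {p_value:.4f}")
if p_value < 0.05:
    print("Difference is significant (P < 0.05).")
else:
    print("Difference is not significant (P > 0.05).")

The same comparison would be repeated for each measured index (growth, biochemical, antioxidant, and immune), which is consistent with the per-index P values reported in the abstract.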